BRN Discussion Ongoing

The following research article is quite interesting because its conclusion points out why the type of LIF neurons forming the basis of Intel's and others' approach to SNNs is fraught with complexity and high error rates. Noting that Intel and DARPA are listed among the sponsors gives this opinion some additional credence:


“5 Conclusion
SNNs have been considered as a potential solution for the low-power machine intelligence due to their event-driven nature of computation and the inherent recurrence that helps retain information over time. However, practical applications of SNNs have not been well demonstrated due to an improper task selection and the vanishing gradient problem. In this work, we proposed SNNs with improved inherent recurrence dynamics that are able to effectively learn long sequences. The benefit of the proposed architectures is 2× reduction in number of the trainable parameters compared to the LSTMs. Our training scheme to train the proposed architectures allows SNNs to produce multiple-bit outputs (as opposed to simple binary spikes) and help with gradient mismatch issue that occurs due to the use of surrogate function to overcome spiking neurons’ non-differentiability. We showed that SNNs with improved inherent recurrence dynamics reduce the gap in speech recognition performance from LSTMs and GRUs to 1.10% and 0.36% on TIMIT and LibriSpeech 100h dataset. We also demonstrated that improved SNNs lead to 10.13-11.14× savings in multiplication operations over standard GRUs on TIMIT and LibriSpeech 100h speech recognition problem. This work serves as an example of how the inherent recurrence of SNNs can be used to effectively learn long temporal sequences for applications on edge computing platforms.
[Figure: (a) prediction accuracy (%) vs. output precision (1-bit, 3-bit, 5-bit, 32-bit), annotated "Large difference"; (b) prediction accuracy (%) vs. number of neurons/layer (1x, 1.5x, 2x, 3x).]

Acknowledgements
This work was supported in part by the Center for Brain Inspired Computing (C-BRIC)—one of the six centers in Joint University Microelectronics Program (JUMP), in part by the Semiconductor Research Corporation (SRC) Program sponsored by Defense Advanced Research Projects Agency (DARPA), in part by Semiconductor Research Corporation, in part by the National Science Foundation, in part by Intel Corporation, in part by the Department of Defense (DoD) Vannevar Bush Fellowship, in part by the U.S. Army Research Laboratory, and in part by the U.K. Ministry of Defence under Agreement W911NF-16-3-0001”
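For readers unfamiliar with the surrogate-gradient training the conclusion refers to, here is a minimal sketch, assuming PyTorch and generic leaky integrate-and-fire dynamics rather than the paper's specific architecture; the decay and threshold values are illustrative only. The spike is a hard step in the forward pass, but the backward pass uses a smooth stand-in so gradients can flow:

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, smooth surrogate gradient in the backward pass."""

    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v_minus_thresh,) = ctx.saved_tensors
        # Fast-sigmoid style surrogate: d(spike)/dv ~ 1 / (1 + |v - thresh|)^2
        return grad_output / (1.0 + v_minus_thresh.abs()) ** 2


def lif_step(x, v, decay=0.9, thresh=1.0):
    """One leaky integrate-and-fire step: leak, integrate input, fire, soft reset."""
    v = decay * v + x
    spike = SurrogateSpike.apply(v - thresh)
    v = v - spike * thresh  # soft reset: subtract the threshold when a spike fires
    return spike, v


# Toy usage: gradients flow back through the surrogate to the input.
x = torch.randn(4, 8, requires_grad=True)
v = torch.zeros(4, 8)
spike, v = lif_step(x, v)
spike.sum().backward()
print(x.grad.shape)  # torch.Size([4, 8])
```

The "gradient mismatch" the authors mention comes from that substitution; their improvement, per the conclusion above, is to let neurons emit multiple-bit outputs rather than the single binary spike shown here.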


My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
Reactions: 23 users

Diogenese

Top 20
#17,113
The following research article is quite interesting because its conclusion points out why the type of LIF neurons forming the basis of Intel's and others' approach to SNNs is fraught with complexity and high error rates. Noting that Intel and DARPA are listed among the sponsors gives this opinion some additional credence:


“5 Conclusion
SNNs have been considered as a potential solution for the low-power machine intelligence due to their event-driven nature of computation and the inherent recurrence that helps retain information over time. However, practical applications of SNNs have not been well demonstrated due to an improper task selection and the vanishing gradient problem. In this work, we proposed SNNs with improved inherent recurrence dynamics that are able to effectively learn long sequences. The benefit of the proposed architectures is 2× reduction in number of the trainable parameters compared to the LSTMs. Our training scheme to train the proposed architectures allows SNNs to produce multiple-bit outputs (as opposed to simple binary spikes) and help with gradient mismatch issue that occurs due to the use of surrogate function to overcome spiking neurons’ non-differentiability. We showed that SNNs with improved inherent recurrence dynamics reduce the gap in speech recognition performance from LSTMs and GRUs to 1.10% and 0.36% on TIMIT and LibriSpeech 100h dataset. We also demonstrated that improved SNNs lead to 10.13-11.14× savings in multiplication operations over standard GRUs on TIMIT and LibriSpeech 100h speech recognition problem. This work serves as an example of how the inherent recurrence of SNNs can be used to effectively learn long temporal sequences for applications on edge computing platforms.
[Figure: (a) prediction accuracy (%) vs. output precision (1-bit, 3-bit, 5-bit, 32-bit), annotated "Large difference"; (b) prediction accuracy (%) vs. number of neurons/layer (1x, 1.5x, 2x, 3x).]

Acknowledgements
This work was supported in part by the Center for Brain Inspired Computing (C-BRIC)—one of the six centers in Joint University Microelectronics Program (JUMP), in part by the Semiconductor Research Corporation (SRC) Program sponsored by Defense Advanced Research Projects Agency (DARPA), in part by Semiconductor Research Corporation, in part by the National Science Foundation, in part by Intel Corporation, in part by the Department of Defense (DoD) Vannevar Bush Fellowship, in part by the U.S. Army Research Laboratory, and in part by the U.K. Ministry of Defence under Agreement W911NF-16-3-0001”


My opinion only DYOR
FF

AKIDA BALLISTA
Hi FF,

I haven't read this paper, but a glance at the references shows that the Brainchip University AI course cannot come soon enough. The authors of academic papers only cite peer-reviewed papers, presumably because other publications are considered unproven, so once universities start turning out peer-reviewed papers about Akida, we may see some knock-on papers citing real-world technology.

The authors seem to have come to the conclusion that sparsity saves power:

Computational Saving
Columns 6-7 of Table 1 report the percentage of zeros in the outputs from all architectures. Only a few outputs from the LSTMs and GRUs were zeros because there was no constraint on how the outputs were generated. SNNs, on the other hand, had many zero outputs because of thresholds that gated small membrane potentials. To support our claim that those zero outputs potentially lead to substantial computation saving on event-driven hardware, we measured the average number of multiplication operations per inference from all architectures and normalized each value to the measurement from LSTMs as illustrated in Column 8-9

...
We also demonstrated that improved SNNs lead to 10.13-11.14× savings in multiplication operations over standard GRUs on TIMIT and LibriSpeech 100h speech recognition problem
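As a rough illustration of why those thresholded zero outputs matter on event-driven hardware, here is a toy back-of-the-envelope sketch in NumPy; the layer sizes, the threshold and the "skip multiplies for zero activations" accounting are my assumptions, not the paper's measurement method:

```python
import numpy as np

rng = np.random.default_rng(0)

# One layer's outputs feeding a fully connected next layer (sizes are arbitrary).
acts = rng.normal(size=1024)
next_layer_width = 512

# Threshold-gate small values to zero, mimicking how a spiking neuron's
# membrane-potential threshold produces many zero outputs.
thresh = 1.0
gated = np.where(np.abs(acts) > thresh, acts, 0.0)

# On event-driven hardware a zero output triggers no multiply-accumulate,
# so the multiplication count scales with the number of non-zero outputs.
dense_mults = np.count_nonzero(acts) * next_layer_width
sparse_mults = max(np.count_nonzero(gated), 1) * next_layer_width
print(f"non-zero outputs: {np.count_nonzero(gated)} / {acts.size}")
print(f"multiplication savings: {dense_mults / sparse_mults:.1f}x")
```

The paper's reported 10.13-11.14× figure comes from its own measurements on TIMIT and LibriSpeech 100h, not from a toy like this.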
 
  • Like
  • Fire
  • Love
Reactions: 35 users

Vojnovic

Regular
Forecast the other day. I put in the green candle. LOL

View attachment 15949

Update today.

View attachment 15950
I too think it is up from here, and once it reaches $1.35, it will form a reverse H&S, where TA then predicts a rise to $1.9. I wonder if it will then bounce back (double top) and arch back (forming a cup and handle), or break out entirely. Fingers crossed.
 
  • Like
  • Fire
Reactions: 20 users

Makeme 2020

Regular
Interesting Short video on Edge AI at the Embedded world 2022....
 
  • Like
Reactions: 7 users

Adam

Regular
  • Like
  • Thinking
Reactions: 3 users

Dozzaman1977

Regular
Found this article in a Facebook group. Surprised no one there had heard of Brainchip. Thoughts, guys? https://www.freethink.com/technolog...utm_medium=social&utm_campaign=BigThinkdotcom
Already discussed here
Research chip
Completely different and inferior to akida on every level
No competition to akida
dragons den competition GIF by CBC
 
  • Like
  • Love
  • Fire
Reactions: 14 users
I have been comparing by close analysis the following two announcements.

The first announcement extract is from the Brainchip Prophesee press release.

The second announcement is from the SynSense Prophesee press release.

We have had much discussion on the first but not on the second but there is in my opinion a valuable insight to be had by doing so.

In the Brainchip Prophesee release Prophesee states “By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs”… This is a clear statement that responsibility for the marketing and commercialisation falls entirely to Prophesee. As we would all expect, since CEO Sean Hehir has removed all doubt that Brainchip is purely and simply an IP seller: it does not make chips or sensors, full stop.

Whereas in the press release by SynSense and Prophesee the following is stated: “The combined vision processing solution will be co-marketed by both companies and commercialized by SynSense for addressing IoT and Smart Home detection and gesture control applications”. The commercialisation of their combined solution falls entirely to SynSense and is restricted to the IoT and smart home markets for detection and gesture control.

The difference, with respect, between these two arrangements is highly significant or, as Big Kev would say, HUGE.


Prophesee has seen such value in AKIDA they have taken on all the responsibility for commercialisation and has not placed any restrictions on the markets where it will be offered to OEMs.

While on the other hand Prophesee has said to SynSense: yes, you can use our sensor and try to commercialise it in the IoT and smart home markets for detection and gesture control, and you can market our name with your solution.

These two partnerships are chalk (SynSense) and the best aged cheddar with a fine wine (Brainchip) in comparison.

Brainchip Prophesee joint press release (extract):

“We’ve successfully ported the data from Prophesee’s neuromorphic-based camera sensor to process inference on Akida with impressive performance,” said Anil Mankar, Co-Founder and CDO of BrainChip. “This combination of intelligent vision sensors with Akida’s ability to process data with unparalleled efficiency, precision and economy of energy at the point of acquisition truly advances state-of-the-art AI enablement and offers manufacturers a ready-to-implement solution.”

“By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs looking to leverage edge-based visual technologies as part of their product offerings,” said Luca Verre, CEO and co-founder of Prophesee.



SynSense Prophesee press release (extract):

“Prophesee broke new ground with the industry first commercial neuromorphic-based vision sensing platform. Its Metavision solutions use an event-based vision approach which delivers significant improvement over traditional frame-based acquisition methods, reducing power and data processing requirements while allowing operation in the most demanding conditions.
“We couldn’t be more excited to enter into a strategic partnership with Prophesee,” said Ning Qiao, CEO of SynSense. “Both companies are world leaders in their respective fields. The deep cooperation between us will promote the development of vision-based neuromorphic intelligence and will accelerate the commercialization of neuromorphic technology.”
Through this partnership SynSense will benefit from Prophesee’s sensing technology and wide existing network of partners. Prophesee will benefit from deep integration with SynSense’s novel processing technology, and world-leading developments in low-power vision applications.
Event-based vision is a paradigm-shifting advancement that is based on how the human eye records and interprets visual inputs. The sensors facilitate machine vision by recording changes in the scene rather than recording the entire scene at regular intervals. Specific advantages over frame-based approaches include better low light response (<1lux) and dynamic range (>120dB), reduced data generation (10x-1000x less than conventional approaches) leading to lower transfer/processing requirements, and higher temporal resolution (microsecond time resolution, i.e. >10k images per second time resolution equivalent).
With Prophesee’s patented Event-Based Metavision sensors, each pixel embeds its own intelligence processing enabling them to activate themselves independently, triggering events.
“As IoT applications in smart homes, offices, industries and cities proliferate, there is an increasing need for more intelligence on the edge. Our experience in implementing neuromorphic enabled vision sensing with smart pixel methods complements SynSense’s expertise in neuromorphic processing to make implementing high performance vision more practical. Together we can drive the development of highly efficient, lower cost products that provide more visibility, safety and productivity in a wide range of use cases.” said Luca Verre, CEO and co-founder of Prophesee.
The partnership will address the design, development, manufacturing and commercialization of the combined neuromorphic technology, including sensors, processing solutions, software and solutions to enable a broad range of applications
“Applying neuromorphic techniques to vision applications represents large market opportunity in many different sector. A recent report by Yole Développement forecasts that neuromorphic computing and sensing will represent between 15% and 20% of total AI computing revenue in 2035, about a roughly $20B market,” said Ning Qiao, CEO of SynSense.
The combined vision processing solution will be co-marketed by both companies and commercialized by SynSense for addressing IoT and Smart Home detection and gesture control applications."
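As a side note on the "recording changes in the scene rather than recording the entire scene" description in the extract above, here is a minimal sketch, assuming NumPy, of how per-pixel events are often modelled from log-intensity changes; this is a synchronous frame-differencing simplification, not Prophesee's asynchronous in-pixel implementation, and the contrast threshold is an illustrative value:

```python
import numpy as np

def events_from_frames(prev_frame, curr_frame, contrast_thresh=0.15):
    """Emit per-pixel events where log intensity changed by more than a threshold.

    Returns (ys, xs, polarity) arrays: a simplified, frame-based model of an
    event camera; real sensors do this asynchronously in each pixel.
    """
    eps = 1e-6
    delta = np.log(curr_frame + eps) - np.log(prev_frame + eps)
    on = delta > contrast_thresh       # brightness increased
    off = delta < -contrast_thresh     # brightness decreased
    ys, xs = np.nonzero(on | off)
    polarity = np.where(on[ys, xs], 1, -1)
    return ys, xs, polarity


# Toy usage: two synthetic frames with intensities in [0, 1); one pixel brightens.
rng = np.random.default_rng(1)
f0 = rng.random((4, 4))
f1 = f0.copy()
f1[1, 2] *= 2.0
print(events_from_frames(f0, f1))  # -> (array([1]), array([2]), array([1]))
```

Real event sensors perform this comparison asynchronously inside each pixel with microsecond timestamps, which is where the data-reduction and temporal-resolution figures quoted above come from.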

We all know thanks to @Diogenese that the SynSense solution might have a place but not as a competitor with Brainchip's AKIDA technology as it is very much a poor man's attempt at CNN neuromorphic computation.

This has been clearly recognised by Prophesee: while happy for SynSense to have a go, they have recognised that Brainchip's AKIDA allows them to be "better able to deliver a complete high-performance and ultra-low power solution to OEMs looking to leverage edge-based visual technologies as part of their product offerings".

PROPHESEE
HAVE TAKEN ON FULL RESPONSIBILITY FOR CREATING AND MARKETING THE AKIDA PROPHESEE SOLUTION.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 76 users

Taproot

Regular
SiMa AI.............This could well be another NDA that's popped up out of nowhere.

Anyone else have that feeling.... apart from Lou, for example??

Tech....:rolleyes:
To me it feels like an " Edge Impulse / Prophesee " type of vibe. I think this mob will show up as a partner soon enough.
Just going back to @Bravo
Synopsys Corporate Overview ( May 22 ) with the picture of Brainchip's Accelerator.
Things started to happen around the time the accelerator was released in 2017.
This bloke has been mentioned in the past " Christoph Fritsch "
“BrainChip’s spiking neural network technology is unique in its ability to provide outstanding performance while avoiding the math-intensive, power-hungry, and high-cost downsides of deep learning in convolutional neural networks,” says Christoph Fritsch, senior director for the industrial, scientific, and medical business at Xilinx.

BrainChip representatives claim that the BrainChip accelerator is the first commercial implementation of a hardware-accelerated spiking neural network system.

This article was released in September 2017. Christoph Fritsch left Xilinx in November 2017 and has since been the:

Director WW Solution Architects for Robotics, Smart Cities, IoT and Edge AI at NVIDIA

This article was released in May 2020

Leading US semiconductor company Xilinx and Stuttgart-based multinational automotive manufacturing firm Daimler AG have signed a new partnership to develop an in-car AI system.

Powered by Xilinx’s Adaptive Compute Acceleration Platform, the smart system will be designed to process and deliver advanced AI-powered automotive applications with high performance and low latency.

Sounds like Brainchip's accelerator !
An early clue to the Merc reveal.

Going back to the Sima ai thing.
How about this bloke. " Krishna Rangasayee " - founder and current CEO of Sima ai
Worked at Xilinx for 18 years. Was the Senior Vice President and GM of the overall business.
Left Xilinx August 2017 !
 
  • Like
  • Fire
  • Love
Reactions: 36 users

TECH

Regular
I have been comparing by close analysis the following two announcements.

The first announcement extract is from the Brainchip Prophesee press release.

The second announcement is from the SynSense Prophesee press release.

We have had much discussion on the first but not on the second but there is in my opinion a valuable insight to be had by doing so.

In the Brainchip Prophesee release Prophesee states “By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs”… This is a clear statement that responsibility for the marketing and commercialisation falls entirely to Prophesee. As we would all expect, since CEO Sean Hehir has removed all doubt that Brainchip is purely and simply an IP seller: it does not make chips or sensors, full stop.

Whereas in the press release by SynSense and Prophesee the following is stated: “The combined vision processing solution will be co-marketed by both companies and commercialized by SynSense for addressing IoT and Smart Home detection and gesture control applications”. The commercialisation of their combined solution falls entirely to SynSense and is restricted to the IoT and smart home markets for detection and gesture control.

The difference, with respect, between these two arrangements is highly significant or, as Big Kev would say, HUGE.


Prophesee has seen such value in AKIDA they have taken on all the responsibility for commercialisation and has not placed any restrictions on the markets where it will be offered to OEMs.

While on the other hand Prophesee has said to SynSense: yes, you can use our sensor and try to commercialise it in the IoT and smart home markets for detection and gesture control, and you can market our name with your solution.

These two partnerships are chalk (SynSense) and the best aged cheddar with a fine wine (Brainchip) in comparison.

Brainchip Prophesee joint press release (extract):

“We’ve successfully ported the data from Prophesee’s neuromorphic-based camera sensor to process inference on Akida with impressive performance,” said Anil Mankar, Co-Founder and CDO of BrainChip. “This combination of intelligent vision sensors with Akida’s ability to process data with unparalleled efficiency, precision and economy of energy at the point of acquisition truly advances state-of-the-art AI enablement and offers manufacturers a ready-to-implement solution.”

“By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs looking to leverage edge-based visual technologies as part of their product offerings,” said Luca Verre, CEO and co-founder of Prophesee.



SynSense Prophesee press release (extract):

“Prophesee broke new ground with the industry first commercial neuromorphic-based vision sensing platform. Its Metavision solutions use an event-based vision approach which delivers significant improvement over traditional frame-based acquisition methods, reducing power and data processing requirements while allowing operation in the most demanding conditions.

Through this partnership SynSense will benefit from Prophesee’s sensing technology and wide existing network of partners. Prophesee will benefit from deep integration with SynSense’s novel processing technology, and world-leading developments in low-power vision applications.
Event-based vision is a paradigm-shifting advancement that is based on how the human eye records and interprets visual inputs. The sensors facilitate machine vision by recording changes in the scene rather than recording the entire scene at regular intervals. Specific advantages over frame-based approaches include better low light response (<1lux) and dynamic range (>120dB), reduced data generation (10x-1000x less than conventional approaches) leading to lower transfer/processing requirements, and higher temporal resolution (microsecond time resolution, i.e. >10k images per second time resolution equivalent).
With Prophesee’s patented Event-Based Metavision sensors, each pixel embeds its own intelligence processing enabling them to activate themselves independently, triggering events.

The partnership will address the design, development, manufacturing and commercialization of the combined neuromorphic technology, including sensors, processing solutions, software and solutions to enable a broad range of applications

The combined vision processing solution will be co-marketed by both companies and commercialized by SynSense for addressing IoT and Smart Home detection and gesture control applications."

We all know thanks to @Diogenese that the SynSense solution might have a place but not as a competitor with Brainchip's AKIDA technology as it is very much a poor man's attempt at CNN neuromorphic computation.

This has been clearly recognised by Prophesee: while happy for SynSense to have a go, they have recognised that Brainchip's AKIDA allows them to be "better able to deliver a complete high-performance and ultra-low power solution to OEMs looking to leverage edge-based visual technologies as part of their product offerings".

PROPHESEE
HAVE TAKEN ON FULL RESPONSIBILITY FOR CREATING AND MARKETING THE AKIDA PROPHESEE SOLUTION.

My opinion only DYOR
FF

AKIDA BALLISTA

Excellent breakdown FF... the details, the details.

With only 18 business days left in the 3rd quarter, I'd be expecting some companies' financial controllers to be preparing some funds transfers into the Brainchip accounts, just to fatten up our 4C and confirm to the flock that, yes, the revenue flow has definitely started.

If there is any decline in the figures to be released in late October I'd be disappointed, to be quite frank; let's hope that a few more layers are added to our snowball up in the Swiss Alps!

Tech.
 
  • Like
  • Love
  • Fire
Reactions: 52 users

Taproot

Regular
To me it feels like an " Edge Impulse / Prophesee " type of vibe. I think this mob will show up as a partner soon enough.
Just going back to @Bravo
Synopsys Corporate Overview ( May 22 ) with the picture of Brainchip's Accelerator.
Things started to happen around the time the accelerator was released in 2017.
This bloke has been mentioned in the past " Christoph Fritsch "
“BrainChip’s spiking neural network technology is unique in its ability to provide outstanding performance while avoiding the math-intensive, power-hungry, and high-cost downsides of deep learning in convolutional neural networks,” says Christoph Fritsch, senior director for the industrial, scientific, and medical business at Xilinx.

BrainChip representatives claim that the BrainChip accelerator is the first commercial implementation of a hardware-accelerated spiking neural network system.

This article was released in September 2017. Christoph Fritsch left Xilinx in November 2017 and has since been the:

Director WW Solution Architects for Robotics, Smart Cities, IoT and Edge AI at NVIDIA

This article was released in May 2020

Leading US semiconductor company Xilinx and Stuttgart-based multinational automotive manufacturing firm Daimler AG have signed a new partnership to develop an in-car AI system.

Powered by Xilinx’s Adaptive Compute Acceleration Platform, the smart system will be designed to process and deliver advanced AI-powered automotive applications with high performance and low latency.

Sounds like Brainchip's accelerator !
An early clue to the Merc reveal.

Going back to the Sima ai thing.
How about this bloke. " Krishna Rangasayee " - founder and current CEO of Sima ai
Worked at Xilinx for 18 years. Was the Senior Vice President and GM of the overall business.
Left Xilinx August 2017 !
The Sima ai Chairman used to be the president and CEO of Xilinx
He left January 2018 !
He is also on the board of TSMC

Moshe Gavrielov joined the SiMa.ai board of directors in November 2019 and was named Chairman of the Board in 2021. Moshe also currently serves on the board of TSMC and is Chairman of the Board at Foretellix, Ltd. (a VC funded company). Previously, Moshe served as President and CEO of Xilinx, Inc. from January 2008 to January 2018 and as Director from February 2008 to January 2018. During his tenure at Xilinx, Moshe delivered revenue growth to over $2.5B, significantly increased profitability, and market share expansion. As a result, market capitalization quadrupled and approached $20B. Moshe was also Executive Chairman of Wind River Systems, Inc. (a TPG private equity owned company) until it was acquired by Aptiv.
 
  • Like
  • Fire
  • Wow
Reactions: 26 users

Slymeat

Move on, nothing to see.
Social media ‘likes’: do they, in and of themselves, tell us anything worthy of being used to make an investment decision? The answer is a raging NO.

The use of the ‘like’ button is completely meaningless or so full of subtext that it is still meaningless unless you canvass the person who hit ‘like’ as to why they did so.

I am sure that at least once a day, just as I do, you will read a post here with multiple ideas expressed therein; one of the many ideas will appeal for some reason but the others not so much, and you give a ‘like’ because you either can’t be bothered posting, are pressed for time, or just don’t post.

Anyway, the following linked article gives a little insight into the meaningless world of likes and their subtext. One of the subtext meanings I find interesting is where a ‘like’ is intended to convey ‘good luck with that plan, loser’.

Have you ever asked yourself whether Rob Telson could have tried to sell AKIDA to one of these companies and been knocked back, so that his ‘like’ is intended to convey ‘good luck with your outdated technology, you loser’:


So, in the absence of something substantive, my view is that ‘likes’ are meaningless no matter how many are given, unless there is something which clearly explains the reason for giving the ‘like’.

My opinion only DYOR
FF

AKIDA BALLISTA
I wrote such a witty and cunning response, but when I posted it, it got scrambled up, so I deleted it so as not to cause confusion. What a pity, it was more cunning than a fox that had just been appointed the Professor of Cunning at Oxford University.

I would have received so many likes that I probably would have wanted to see myself naked. You have to read the article to understand that reference.

But alas, the ether distorted my words, throwing most of them to the nether regions, or maybe I pressed a wrong key, and all is now lost like a good fart in a strong breeze.
 
  • Haha
  • Like
  • Love
Reactions: 20 users

Taproot

Regular
The Sima ai Chairman used to be the president and CEO of Xilinx
He left January 2018 !
He is also on the board of TSMC

Moshe Gavrielov joined the SiMa.ai board of directors in November 2019 and was named Chairman of the Board in 2021. Moshe also currently serves on the board of TSMC and is Chairman of the Board at Foretellix, Ltd. (a VC funded company). Previously, Moshe served as President and CEO of Xilinx, Inc. from January 2008 to January 2018 and as Director from February 2008 to January 2018. During his tenure at Xilinx, Moshe delivered revenue growth to over $2.5B, significantly increased profitability, and market share expansion. As a result, market capitalization quadrupled and approached $20B. Moshe was also Executive Chairman of Wind River Systems, Inc. (a TPG private equity owned company) until it was acquired by Aptiv.
And this bloke.
Lip-Bu - Sima ai board member.
This guy is on the Boards of SoftBank and Schneider Electric
5 days ago he was elected to the Board of Intel


Lip-Bu Tan is Founder and Chairman of Walden International (“WI”), a leading venture capital firm managing cumulative capital commitments of greater than $4 billion; and Founding Managing Partner of Celesta Capital and Walden Catalyst Ventures, a venture capital firm focused on investing in core technology companies. He formerly served as Chief Executive Officer of Cadence Design Systems, Inc., and now serves as Executive Chairman and has been a member of the Cadence Board of Directors since 2004. He currently serves on the Boards of Schneider Electric SE (SU: FP) and SoftBank Group (Japan: 9984.T). Lip-Bu focuses on semiconductor/components, cloud/edge infrastructure, data management and security, and AI/machine learning. Lip-Bu received his B.S. from Nanyang University in Singapore, his M.S. in Nuclear Engineering from the Massachusetts Institute of Technology, and his MBA from the University of San Francisco. Lip-Bu currently serves on the Board of Trustees at Fuller Theological Seminary and the School of Engineering Dean’s Council at Carnegie Mellon University (CMU), a member of the College of Engineering Advisory Board and Computing, Data Science, and Society Advisory Board at University of California Berkeley, Global Advisory Board Member of METI Japan, and a member of The Business Council and Committee 100.
 
  • Like
  • Fire
  • Love
Reactions: 28 users
Poor old Tesla: first Mercedes-Benz beats it to more than 1,000 kilometres on a single charge by over 200 kilometres.

Now Lucid is going from 0 to 100 kph in less than 2 seconds.

No, wait, Tesla was the first to be approved for Level 3 autonomy... no, missed it again: Mercedes-Benz managed that one too, and then the Honda Legend took second place. Third place is still up for grabs; perhaps Ralph will put in a good word with the regulators.

 
  • Like
  • Haha
  • Fire
Reactions: 22 users

uiux

Regular
moved post to NASA thread:

 
  • Like
  • Fire
  • Love
Reactions: 23 users

Taproot

Regular
Aug 30 (Reuters) - Silicon Valley-based SiMa.ai, a machine learning startup backed by Fidelity Management & Research Company, said on Tuesday it started shipping chips and systems to customers that are testing them, an important step for mass production.

The product called MLSoC, short for machine learning system on chip, is designed to process video and images using machine learning and traditional computing on a single platform.


It is used for industrial robotics, drones, security cameras, satellite imaging and eventually self-driving cars.

SiMa.ai board members include Moshe Gavrielov, an independent director on the world's top chip maker Taiwan Semiconductor Manufacturing Co.'s (2330.TW) board, and Lip‑Bu Tan, who was elected to Intel's (INTC.O) board of directors, effective Sept. 1, 2022.

Krishna Rangasayee, SiMa.ai CEO and founder, said while data centers and cloud systems use cutting edge chips for machine learning applications, chip development for gadgets such as security cameras, drones has been slower despite the explosion of new internet connected devices.


"In my mind, it's a bigger market than the cloud. It's multi-trillion dollar market," he said.

Tan, who has also invested in SiMa.ai, said he was attracted by the company's strategy of building a software platform for the chip that supports all machine learning formats, making it easier for companies to use the chip in their products.

SiMa.ai's semiconductors are manufactured at TSMC, and the company used chip design software from Synopsys Inc (SNPS.O) to design the chip. Mass production is scheduled for the first quarter of next year, said Rangasayee.
 
  • Like
  • Fire
  • Love
Reactions: 32 users

Rskiff

Regular
Poor old Tesla: first Mercedes-Benz beats it to more than 1,000 kilometres on a single charge by over 200 kilometres.

Now Lucid is going from 0 to 100 kph in less than 2 seconds.

No, wait, Tesla was the first to be approved for Level 3 autonomy... no, missed it again: Mercedes-Benz managed that one too, and then the Honda Legend took second place. Third place is still up for grabs; perhaps Ralph will put in a good word with the regulators.

Yeah, that may be the case @Fact Finder, but look at the profit margins Tesla makes on each vehicle and the number of EVs they are producing now and continuing to ramp. Makes the others look rather pathetic imo
 
  • Like
  • Fire
Reactions: 9 users
And this bloke.
Lip-Bu - Sima ai board member.
This guy is on the Boards of SoftBank and Schneider Electric
5 days ago he was elected to the Board of Intel


Lip-Bu Tan is Founder and Chairman of Walden International (“WI”), a leading venture capital firm managing cumulative capital commitments of greater than $4 billion; and Founding Managing Partner of Celesta Capital and Walden Catalyst Ventures, a venture capital firm focused on investing in core technology companies. He formerly served as Chief Executive Officer of Cadence Design Systems, Inc., and now serves as Executive Chairman and has been a member of the Cadence Board of Directors since 2004. He currently serves on the Boards of Schneider Electric SE (SU: FP) and SoftBank Group (Japan: 9984.T). Lip-Bu focuses on semiconductor/components, cloud/edge infrastructure, data management and security, and AI/machine learning. Lip-Bu received his B.S. from Nanyang University in Singapore, his M.S. in Nuclear Engineering from the Massachusetts Institute of Technology, and his MBA from the University of San Francisco. Lip-Bu currently serves on the Board of Trustees at Fuller Theological Seminary and the School of Engineering Dean’s Council at Carnegie Mellon University (CMU), a member of the College of Engineering Advisory Board and Computing, Data Science, and Society Advisory Board at University of California Berkeley, Global Advisory Board Member of METI Japan, and a member of The Business Council and Committee 100.
Just love this "and the School of Engineering Dean’s Council at Carnegie Mellon University (CMU),"

Great set of dots you have dragged together over these posts. It makes a very compelling argument, sufficient to remove any surprise if we get an announcement confirming they are using or will use AKIDA.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 26 users

Adam

Regular
Already discussed here
Research chip
Completely different and inferior to akida on every level
No competition to akida
dragons den competition GIF by CBC
Thanks for the heads up. Had no doubt our chip was better.
 
  • Like
Reactions: 5 users
Yeah, that may be the case @Fact Finder, but look at the profit margins Tesla makes on each vehicle and the number of EVs they are producing now and continuing to ramp. Makes the others look rather pathetic imo

True, but they are one company, and there are now many seasoned competitors with established sales networks and historically rusted-on customers. China is an unstable market for any manufacturer to depend on, and it is racing ahead with its own home-grown brands to compete with Tesla there. It would be quite silly to think that the race Tesla has been winning is the one it will be running between now and 2030.

We constantly concern ourselves here with potential competitors, even though we have a three-year lead extending to five years either presently or shortly, so why would the same concerns not apply to Tesla?

I was reading recently that there are Tesla owners who are now so disenchanted with Mr. Musk that they have said that, while they love their Tesla vehicle, when they come to replace it they will happily look at other comparable brands.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
Reactions: 12 users

Diogenese

Top 20
I have been comparing by close analysis the following two announcements.

The first announcement extract is from the Brainchip Prophesee press release.

The second announcement is from the SynSense Prophesee press release.

We have had much discussion on the first but not on the second but there is in my opinion a valuable insight to be had by doing so.

In the Brainchip Prophesee release Prophesee states “By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs”… This is a clear statement that responsibility for the marketing and commercialisation falls entirely to Prophesee. As we would all expect, since CEO Sean Hehir has removed all doubt that Brainchip is purely and simply an IP seller: it does not make chips or sensors, full stop.

Whereas in the press release by SynSense and Prophesee the following is stated: “The combined vision processing solution will be co-marketed by both companies and commercialized by SynSense for addressing IoT and Smart Home detection and gesture control applications”. The commercialisation of their combined solution falls entirely to SynSense and is restricted to the IoT and smart home markets for detection and gesture control.

The difference, with respect, between these two arrangements is highly significant or, as Big Kev would say, HUGE.


Prophesee has seen such value in AKIDA they have taken on all the responsibility for commercialisation and has not placed any restrictions on the markets where it will be offered to OEMs.

While on the other hand Prophesee has said to SynSense: yes, you can use our sensor and try to commercialise it in the IoT and smart home markets for detection and gesture control, and you can market our name with your solution.

These two partnerships are chalk (SynSense) and the best aged cheddar with a fine wine (Brainchip) in comparison.

Brainchip Prophesee joint press release (extract):

“We’ve successfully ported the data from Prophesee’s neuromorphic-based camera sensor to process inference on Akida with impressive performance,” said Anil Mankar, Co-Founder and CDO of BrainChip. “This combination of intelligent vision sensors with Akida’s ability to process data with unparalleled efficiency, precision and economy of energy at the point of acquisition truly advances state-of-the-art AI enablement and offers manufacturers a ready-to-implement solution.”

“By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs looking to leverage edge-based visual technologies as part of their product offerings,” said Luca Verre, CEO and co-founder of Prophesee.



SynSense Prophesee press release (extract):

“Prophesee broke new ground with the industry first commercial neuromorphic-based vision sensing platform. Its Metavision solutions use an event-based vision approach which delivers significant improvement over traditional frame-based acquisition methods, reducing power and data processing requirements while allowing operation in the most demanding conditions.

Through this partnership SynSense will benefit from Prophesee’s sensing technology and wide existing network of partners. Prophesee will benefit from deep integration with SynSense’s novel processing technology, and world-leading developments in low-power vision applications.
Event-based vision is a paradigm-shifting advancement that is based on how the human eye records and interprets visual inputs. The sensors facilitate machine vision by recording changes in the scene rather than recording the entire scene at regular intervals. Specific advantages over frame-based approaches include better low light response (<1lux) and dynamic range (>120dB), reduced data generation (10x-1000x less than conventional approaches) leading to lower transfer/processing requirements, and higher temporal resolution (microsecond time resolution, i.e. >10k images per second time resolution equivalent).
With Prophesee’s patented Event-Based Metavision sensors, each pixel embeds its own intelligence processing enabling them to activate themselves independently, triggering events.

The partnership will address the design, development, manufacturing and commercialization of the combined neuromorphic technology, including sensors, processing solutions, software and solutions to enable a broad range of applications

The combined vision processing solution will be co-marketed by both companies and commercialized by SynSense for addressing IoT and Smart Home detection and gesture control applications."

We all know thanks to @Diogenese that the SynSense solution might have a place but not as a competitor with Brainchip's AKIDA technology as it is very much a poor man's attempt at CNN neuromorphic computation.

This has been clearly recognised by Prophesee: while happy for SynSense to have a go, they have recognised that Brainchip's AKIDA allows them to be "better able to deliver a complete high-performance and ultra-low power solution to OEMs looking to leverage edge-based visual technologies as part of their product offerings".

PROPHESEE
HAVE TAKEN ON FULL RESPONSIBILITY FOR CREATING AND MARKETING THE AKIDA PROPHESEE SOLUTION.

My opinion only DYOR
FF

AKIDA BALLISTA

Compare and contrast:
The combined vision processing solution will be co-marketed by both companies and commercialized by SynSense for addressing IoT and Smart Home detection and gesture control applications

By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs


From the Prophesee "Applications" page:

https://www.prophesee.ai/event-based-vision-applications/
1662441458933.png

Both Prophesee and SynSense are in the automotive business:

https://www.prophesee.ai/event-based-vision-applications/

1662442233934.png



https://www.synsense-neuromorphic.com/solutions/

1662442350435.png




So has automotive been omitted from the Prophesee/Synsense partnership?
 
  • Like
  • Love
  • Fire
Reactions: 16 users