Sony to make self-driving sensors that need 70% less power
"Sony plans to lower the amount of electricity needed in self-driving systems through edge computing, processing as much data as possible through AI-equipped sensors and software on the vehicles themselves instead of transmitting it to external networks. This approach is expected to shrink communication lags as well, making the vehicles safer."
"Sony will also incorporate image recognition and radar technologies into the new sensor, drawing on various types of data to facilitate self-driving even in rain and other difficult conditions.
The group controls almost half the global market for image sensors. It entered the automotive market in 2014, and aims to have dealings with 75% of key automakers worldwide by fiscal 2025."
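To put rough numbers on the latency point (a minimal sketch; every figure below is my own assumption, not from the article), here is how keeping the processing on the vehicle removes the network round trip:

```python
# Illustrative latency budget: on-vehicle edge inference vs. sending data to an
# external network. All figures are assumed for illustration only.

SENSOR_READOUT_MS = 5.0     # assumed time to read a frame off the sensor
EDGE_INFERENCE_MS = 10.0    # assumed on-vehicle AI inference time
NETWORK_RTT_MS = 60.0       # assumed cellular round trip to an external server
CLOUD_INFERENCE_MS = 10.0   # assumed server-side inference time

edge_latency = SENSOR_READOUT_MS + EDGE_INFERENCE_MS
cloud_latency = SENSOR_READOUT_MS + NETWORK_RTT_MS + CLOUD_INFERENCE_MS

print(f"edge path:  {edge_latency:.0f} ms per decision")
print(f"cloud path: {cloud_latency:.0f} ms per decision")
# Under these assumptions the on-vehicle path reacts ~60 ms sooner per frame,
# which is the "communication lag" the article says edge processing removes.
```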
Or could it be the battered sav?
Sad case of savaphobia.
BrainChip Partners with Prophesee Optimizing Computer Vision AI Performance and Efficiency
“We’ve successfully ported the data from Prophesee’s neuromorphic-based camera sensor to process inference on Akida with impressive performance,” said Anil Mankar, Co-Founder and CDO of BrainChip. “This combination of intelligent vision sensors with Akida’s ability to process data with unparalleled efficiency, precision and economy of energy at the point of acquisition truly advances state-of-the-art AI enablement and offers manufacturers a ready-to-implement solution.”
“By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs looking to leverage edge-based visual technologies as part of their product offerings,” said Luca Verre, CEO and co-founder of Prophesee.
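For anyone wondering what "processing at the point of acquisition" means in practice, here is a minimal sketch of the idea: events from an event-based camera are integrated into sparse frames and classified locally, so only the decision leaves the device. EventCamera, SpikingClassifier and events_to_frame below are hypothetical stand-ins, not actual Prophesee Metavision or BrainChip Akida APIs.

```python
# Sketch of on-sensor event-based inference: accumulate events into a sparse
# frame and classify it next to the sensor instead of streaming raw data out.
# All classes below are illustrative stand-ins, not real SDK calls.
import numpy as np

class EventCamera:
    """Hypothetical event source yielding (x, y, polarity) batches."""
    def __init__(self, width=1280, height=720):
        self.width, self.height = width, height
    def read_events(self, n=500):
        rng = np.random.default_rng()
        xs = rng.integers(0, self.width, n)
        ys = rng.integers(0, self.height, n)
        pol = rng.integers(0, 2, n)
        return xs, ys, pol

class SpikingClassifier:
    """Hypothetical stand-in for an on-device (Akida-class) model."""
    def predict(self, frame: np.ndarray) -> int:
        return int(frame.sum() % 10)   # placeholder decision

def events_to_frame(xs, ys, pol, width, height):
    # Integrate a batch of events into a sparse two-channel frame.
    frame = np.zeros((height, width, 2), dtype=np.uint8)
    frame[ys, xs, pol] = 1
    return frame

camera = EventCamera()
model = SpikingClassifier()
for _ in range(3):                      # a few acquisition cycles
    xs, ys, pol = camera.read_events()
    frame = events_to_frame(xs, ys, pol, camera.width, camera.height)
    label = model.predict(frame)        # inference happens at the sensor
    print("local decision:", label)
```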
METAVISION GEN4 EVALUATION KIT 2 - HD S-MOUNT
Prophesee Evaluation Kit 2 HD enables full performance evaluation of the Generation 4.1 Event-Based Vision sensor co-developed with SONY Semiconductor Solutions, featuring the industry’s smallest pixels and superior HDR performance.
Metavision for Machines
Inventor of the world’s most advanced neuromorphic vision systems
www.automate.org
Metavision® sensors and algorithms mimic how the human eye and brain work to dramatically improve efficiency in areas such as autonomous vehicles, industrial automation, IoT, security and surveillance, and AR/VR. Prophesee is based in Paris, with local offices in Grenoble, Shanghai, Tokyo and Silicon Valley. The company is driven by a team of more than 100 visionary engineers, holds more than 50 international patents and is backed by leading international investors including Sony, iBionext, 360 Capital Partners, Intel Capital, Robert Bosch Venture Capital, Supernova Invest, and European Investment Bank.
Maybe the US company is going to be a partnership?
Why would they want to know if you are already a shareholder in Brainchip???
I think they also asked if you're used to having shares in escrow. I think they like us.
Why would they want to know if you are already a shareholder in Brainchip???
Possibly because BRN shareholders are renowned for being highly intelligent and long term holders.
I think they also asked if you're used to having shares in escrow. I think they like us.
Maybe BRN will have a percentage of the company and all shareholders will get shares issued in escrow.
I think they also asked if you're used to having shares in escrow. I think they like us.
In the spirit of encouraging input from our German friends far away who also provide great research - I think you should just use Google Translate.
Nviso just posted it on twitter. Mercedes as well!
Ok. I personally haven't seen this yet as just found it so....
I'm just gonna post a couple snips, links and a PDF....will let it speak for itself
Investor Fact Sheet | NVISO
NVISO are leading experts in artificial intelligence and deep learning to accurately detect and predict human behaviors using visual intelligence.
www.ir.nviso.ai
NVISO Investor Interest - Formstack
nviso.formstack.com
Sorry
English please
I think we’ve seen this announcement a while ago.
What do you make out of this, posted today!
BrainChip selected by U.S. Air Force Research Laboratory to develop AI-based radar
2022-07-19 11:23 HKT
BrainChip, the world's first commercial producer of neuromorphic artificial intelligence chips and IP, today announced that Information Systems Laboratories (ISL) is developing an AI-based radar research solution for the U.S. Air Force Research Laboratory (AFRL) based on its Akida™ neural network processor.
ISL is a specialist in expert research and complex analysis, software and systems engineering, advanced hardware design and development, and high-quality manufacturing for a variety of clients worldwide.
ISL focuses on areas such as advanced signal processing, space exploration, subsea technology, surveillance and tracking, cybersecurity, advanced radar systems and energy independence. As a member of the BrainChip Early Access Program (EAP), ISL will be able to access evaluation boards for Akida devices, software and hardware support, and dedicated engineering resources.
"As part of BrainChip's EAP, we had the opportunity to directly assess the capabilities Akida offers to the AI ecosystem," said Jamie Bergin, Senior Vice President, Research, Development and Engineering Solutions Manager at ISL.
BrainChip brings AI to the edge in ways not possible with existing technologies. Akida processors feature ultra-low power consumption and high performance to support the development of edge AI technologies by using neuromorphic architecture, a type of artificial intelligence inspired by the biology of the human brain.
BrainChip's EAP program provides partners with the ability to realize significant benefits of power consumption, design flexibility and true learning at the edge.
"ISL's decision to use Akida and Edge-based learning as a tool to incorporate into their research and engineering solutions portfolio is in large part due to the go-to-market advantages our innovation capabilities and production-ready status provide ” said Sean Hehir, CEO of BrainChip, “We are delighted to be a partner of AFRL and ISL on edge AI and machine learning. We believe the combination of technologies will help accelerate the deployment of AI in the field.”
Akida is currently licensed as IP and is also available to order for chip production. It focuses on low power consumption and high performance, supports sensory processing, and is suited to applications that benefit from artificial intelligence, such as smart healthcare, smart cities, smart transportation, and smart homes.
Snap
Afternoon Tony Cole,
Yes it is great news.
This announcement was released on 10 January 2022, and for some strange reason the date keeps auto updating, to give the impression it is new info.
Still bloody good news though.
Regards,
Esq.
Agreed, let’s go RT
I must say, I would be feeling a LOT more confident of a solid connection if Rob Telson just happened to all of a sudden "like" heaps of Sony stuff on LinkedIn. Wink wink nudge nudge Tony!!!
In the absence of any such examples I can only hope that the "contact" with Sony went extremely well back in 2019!
Extract from Update for the March 2019 Quarter - BrainChip
I sure hope so, my only concern is it may be SynSense working with Sony… it mentions Switzerland? SynSense is in Zurich.
NVISO announces it has reached a key interoperability milestone with BrainChip Akida Neuromorphic IP achieving ultra-low latency (<1ms) processing at the edge
NVISO has successfully completed full interoperability of four of its AI Apps from its Human Behavioural AI catalogue with the BrainChip Akida neuromorphic processor, achieving a blazing average model throughput of more than 1000 FPS and an average model storage of less than 140 KB.
www.nviso.ai
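The headline numbers in that release hang together, by the way; the per-frame arithmetic is sketched below (the throughput and storage figures are the ones NVISO quotes, the rest is just division):

```python
# Quick sanity check of the quoted figures.
throughput_fps = 1000        # "more than 1000 FPS" average model throughput
model_storage_kb = 140       # "less than 140 KB" average model storage

per_frame_latency_ms = 1000 / throughput_fps
print(f"{throughput_fps} FPS -> {per_frame_latency_ms:.1f} ms per inference")
# 1000 FPS corresponds to 1 ms per inference, consistent with the "<1 ms" claim.

models_per_mb = 1024 / model_storage_kb
print(f"roughly {models_per_mb:.1f} models of that size fit in 1 MB of on-chip memory")
```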
What a day! The 1000 eyes have certainly had a field day! and GREEN too!
Makes you wonder why anyone would question whether those at Brainchip are doing their job.
One of these days BRN is going to explode, and you'd want to be hanging on to as many shares as you could!
DYOR not financial advice.
When you say "Nviso do not have eye tracking capabilities" perhaps you haven't seen the attached video. Watch from about the 2.25 mark.
I don't mind who they partner with but they definitely do have eye tracking. Since this video there have been many improvements.
Hey, sorry guys.
I do not want to throw the cat amongst the pigeons, but suggest anyone thinking of investing with NVISO take a look at the thread highlighted to me by Rise a page or so back. It is a thread describing what are perhaps disgruntled, perhaps manipulative players who purport to have been involved in previous capital raisings and were disappointed with the outcome thus far.
It may be ancient history and it may be all bullshit, but I think it worth being aware of, in case there is any truth attached.
I had and have no further knowledge of these events beyond what I have read in that thread and sincerely hope that I am not spreading baseless rumours with this post, but suggest to DYOR on this matter.
GLTAH