BRN Discussion Ongoing

Terroni2105

Founding Member
About NVISO

NVISO is an Artificial Intelligence company founded in 2009 and headquartered at the Innovation Park of the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland. Its mission is to help teach machines to understand people and their behavior to make autonomous machines safe, secure, and personalized for humans. As leader in human behavioral AI, it provides robust embedded software solutions that can sense, comprehend, and act upon human behavior in real-world environments deployed at the deep edge. It achieves this through real-time perception and observation of people and objects in contextual situations combined with the reasoning and semantics of human behavior based on trusted scientific research. NVISO’s technology is made accessible through ready-to-use AI solutions addressing Smart Mobility and Smart Health and Living applications (in-cabin monitoring systems, health assessments, and companion robot sensing) with a key focus on the deep and extreme edge with ultra-low power processing capabilities. With a singular focus on how to apply the most advanced and robust technology to industry and societal problems that matter, NVISO’s solutions help advance human potential through more robust and rich human machine interactions. ir.nviso.ai

involved in Smart Health, Smart Mobility, Smart Wealth

 
  • Like
  • Love
  • Fire
Reactions: 25 users
Trawling through their recent news feeds, NVISO is working with some interesting companies, it seems.



Another strong partnership with another premium company in my opinion. Well done Brainchip.
 
  • Like
  • Fire
  • Love
Reactions: 40 users


One of the co-founders is a Kiwi/Aussie!

 
  • Like
  • Fire
  • Love
Reactions: 39 users

Boab

I wish I could paint like Vincent
If you want to stop worrying about competitors and whether Brainchip is moving fast enough I suggest reading the following article:

Toward Optoelectronic Chips That Mimic the Human Brain
Mon, 18 Apr 2022 19:00:01 +0000
[Image: a glowing chip sits in the middle of a colorfully rendered illustration of a brain made of connecting lines]


The human brain, which is made up of some 86 billion neurons connected in a neural network, can perform remarkable feats of computing. Yet it consumes just a dozen or so watts. How does it do it?

IEEE Spectrum recently spoke with Jeffrey Shainline, a physicist at the National Institute of Standards and Technology in Boulder, Colo., whose work may shine some light on this question. Shainline is pursuing an approach to computing that can power advanced forms of artificial intelligence—so-called spiking neural networks, which more closely mimic the way the brain works compared with the kind of artificial neural networks that are widely deployed now. Today, the dominant paradigm uses software running on digital computers to create artificial neural networks that have multiple layers of neurons. These “deep” artificial neural networks have proved immensely successful, but they require enormous computing resources and energy to run. And those energy requirements are growing quickly: in particular, the calculations involved in training deep neural networks are becoming unsustainable.

Researchers have long been tantalized by the prospect of creating artificial neural networks that more closely reflect what goes on in networks of biological neurons, where, as one neuron accepts signals from multiple other neurons, it may reach a threshold level of activation that causes it to “fire,” meaning that it produces an output signal spike that is sent to other neurons, perhaps inducing some of them to fire as well.
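The integrate-and-fire behaviour described above can be sketched in a few lines. This is a minimal leaky integrate-and-fire model for illustration only, not NIST's hardware; the leak factor and threshold values are assumptions.

```python
def lif_step(v, input_current, leak=0.9, threshold=1.0):
    """Advance a leaky integrate-and-fire neuron by one time step.

    The membrane potential decays by `leak`, accumulates the input,
    and emits a spike (resetting to zero) once it crosses `threshold`.
    """
    v = v * leak + input_current
    if v >= threshold:
        return 0.0, True   # fire: emit a spike and reset
    return v, False        # stay sub-threshold

# A neuron receiving a steady sub-threshold input still fires once
# the accumulated potential crosses the threshold.
v, spikes = 0.0, 0
for _ in range(5):
    v, fired = lif_step(v, 0.4)
    spikes += fired
```

With a constant input of 0.4 the potential climbs over three steps, fires once, and begins accumulating again, which is the thresholded, event-driven behaviour the paragraph describes.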

“Compared with semiconductors, you can fit many more neurons and synapses on a wafer because you can stack in the third dimension. You can have maybe 10 layers, and that’s a big advantage.”
—Jeffrey Shainline, NIST

A few companies have produced chips for implementing electronic spiking neural networks. Shainline’s research focuses on using superconducting optoelectronic elements in such networks. His work has recently advanced from investigating theoretical possibilities to performing hardware experiments. He tells Spectrum about these latest developments in his lab.

I’ve heard for years about neuromorphic processing chips from IBM and elsewhere, but I don’t get a sense that they have gained traction in the practical world. Am I wrong?

Jeffrey Shainline: Good question: Spiking neural networks—what are they good for?

IBM’s True North chip from 2014 made a big splash because it was new and different and exciting. More recently, Intel has been doing great things with its Loihi chip. Intel now has its second generation of that. But whether these chips will solve real problems remains a big question.

We know that biological brains can do things that are unmatched by digital computers. Yet these spiking neuromorphic chips don’t immediately knock our socks off. Why not? I don’t think that’s an easy question to answer.

One thing that I’ll point out is that one of these chips doesn’t have 10 billion neurons (roughly the number of neurons in a person’s brain). Even a fruit-fly brain has about 150,000 neurons. Intel’s most recent Loihi chip doesn’t even have that.

Knowing that they are struggling with what they’re going to do with this chip, the folks at Intel have done something clever: They’re giving academics and startups cheap access to their chip—for free in a lot of cases. They’re crowdsourcing creativity in hopes that somebody will find a killer app.

What would you guess the first killer app will be?

Shainline: Maybe a smart speaker. A smart speaker needs to be always on, waiting for you to say some keyword or phrase. That normally requires a lot of power. But studies have shown that a very simple spiking neural algorithm running on one simple chip can do this while using almost no power.

Tell me about the optoelectronic devices that you and your NIST colleagues are working on and how they might improve spiking neural networks.

Shainline: First, you need to understand that light is going to be the best way that you can communicate between neurons in a spiking neural system. That’s because nothing can go faster than light. So using light for communication will allow you to have the biggest spiking neural network possible.

But it’s not enough to just send signals fast. You also need to do it in an energy-efficient manner. So once you’ve chosen to send signals in the form of light, the best energy efficiency you can achieve is to send just one photon from a neuron to each of its synaptic connections. You can’t make an amount of light any smaller.
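For a sense of scale, the energy of one photon follows E = hc/λ. The sketch below assumes a 1550 nm telecom-band photon; the interview does not specify a wavelength, so that choice is illustrative.

```python
# Energy of a single photon, E = h * c / wavelength.
h = 6.62607015e-34    # Planck constant, J*s
c = 2.99792458e8      # speed of light, m/s
wavelength = 1550e-9  # assumed telecom-band wavelength, m

energy_joules = h * c / wavelength
energy_ev = energy_joules / 1.602176634e-19  # joules -> electron volts
# Roughly 1.3e-19 J (about 0.8 eV) per synaptic event, orders of
# magnitude below the switching energy of conventional digital logic.
```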

And the superconducting detectors we are investigating are the best there are when it comes to detecting single photons of light—best in terms of how much energy they dissipate and how fast they can operate.

You could also build a spiking neural network that uses room-temperature semiconductors to send and receive optical signals, though. And right now, it isn’t obvious which strategy is best. But because I’m biased, let me share some reasons to pursue the superconducting approach.

Admittedly, using superconducting elements imposes a lot of overhead—you have to build everything in a cryogenic environment so that your devices remain cold enough to superconduct. But once you’ve done that, you can easily add another crucial element: something called a Josephson junction.

Josephson junctions are the key building block for superconducting computing hardware, whether they’re for superconducting qubits in a quantum computer, superconducting digital logic gates, or superconducting neurons.

Once you’ve decided to use light for communication and superconducting single-photon detectors to sense that light, you will have to build your computer in a cryogenic environment. So without further overhead, you now have Josephson junctions at your disposal.

And this brings a benefit that’s not obvious: It turns out that it is easier to integrate Josephson junctions in three dimensions than it is to integrate [MOSFETs—metal oxide semiconductor field-effect transistors] in three dimensions. That’s because with semiconductors, you fabricate MOSFETs on the lower plane of a silicon wafer. Then you put all your wiring layers up on top. And it becomes essentially impossible to put another layer of MOSFETs on top of that with standard processing techniques.

In contrast, it’s not hard to fabricate Josephson junctions on multiple planes. Two different research groups have demonstrated that. The same is true for the single-photon detectors that we’ve been talking about.

This is a key benefit when you consider what will be needed to allow these networks to scale into something resembling a brain in complexity. Compared with semiconductors, you can fit many more neurons and synapses on a wafer because you can stack in the third dimension. You can have maybe 10 layers, and that’s a big advantage.

The theoretical implications of this approach to computing are impressive. But what kind of hardware have you and your colleagues actually built?

Shainline: One of our most exciting recent results is the demonstration of the integration of superconducting single-photon detectors with Josephson junctions. What that allows us to do is receive single photons of light and use that to switch a Josephson junction and produce an electrical signal and then integrate the signals from many photon pulses.

We’ve recently demonstrated that technology here in our lab. We have also fabricated on-chip light sources that work at low temperatures. And we’ve spent a lot of time on the waveguides needed to carry light signals around on a chip, too.

I mentioned the 3D-integration—the stacking—that’s possible with this kind of computing technology. But if you’re going to have each neuron communicate to a few thousand other neurons, you would also need some way for optical signals to transition without loss from a waveguide in one layer to a waveguide in another layer. We’ve demonstrated that with as many as three stacked planes of these waveguides and believe we could extend that to 10 or so layers.

When you say “integration,” do you just mean that you’ve wired these components together, or do you have everything working on one chip?

Shainline: We have indeed combined superconducting single-photon detectors with Josephson junctions on one chip. That chip gets mounted on a little printed circuit board that we put inside a cryostat to keep it cold enough to remain superconducting. And we use fiber optics for communication from room temperature to low temperature.

Why are you so keen to pursue this approach, and why aren’t others doing the same?

Shainline: There are some pretty strong theoretical arguments as to why this approach to neuromorphic computing could be quite a game changer. But it requires interdisciplinary thinking and collaboration, and right now, we’re really the only group doing specifically this.

I would love it if more people got into the mix. My goal as a researcher is not to be the one that does all this stuff first. I'd be very pleased if researchers from different backgrounds contributed to the development of this technology.

My opinion only DYOR
FF

AKIDA BALLISTA
This reminds me of the story of Archer Materials (AXE). They have figured out a way of doing quantum computing at room temperature and, as with BRN, Mr Shainline may not be aware of this?
AXE is a very exciting Co but I felt they were a long way off from producing a commercially available product, and that's why I sold all my AXE shares to buy more BRN. As Sean Hehir said, "watch the financials".
I'm excited.
 
  • Like
  • Fire
  • Love
Reactions: 17 users
AGM notice listed on the ASX. Details how to log in remotely and vote etc.

98 pages of info to devour. :)
 
  • Like
  • Wow
  • Love
Reactions: 16 users
involved in Smart Health, Smart Mobility, Smart Wealth

SMART WEALTH: how appropriate is that as a beneficial use for all visionary Brainchip investors. I mostly hate marketing spin, but I must say this is one for a T-shirt. 😂🤣

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Haha
  • Like
  • Fire
Reactions: 15 users
AGM notice listed on the ASX. Details how to log in remotely and vote etc.

98 pages of info to devour. :)
This is one to read closely, as it is asking shareholders to approve a brand new Constitution, not just amend the old one.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Wow
  • Fire
Reactions: 14 users

Euks

Regular

Pretty good incentive for our CEO to achieve his performance criteria. 🤑
 
  • Like
Reactions: 17 users


Hopefully today’s announcement is only the tip of the NVISO iceberg - I see you Tony 👀


PRESS RELEASE​

NVISO and PANASONIC sign license agreement to embed Human Behaviour AI in companion robots​

15th March, 2022​

NVISO signs a multi-year license agreement with PANASONIC for deployment of its Human Behavioural Analytics AI software solution in the PANASONIC Companion Robot NICOBO

Lausanne, Switzerland – March 15, 2022 – nViso SA (NVISO), the leading Human Behavioural Analytics AI company, is pleased to announce it has signed a production license agreement with Panasonic for the deployment of NVISO’s Human Behaviour AI Apps for the Panasonic NICOBO companion robot¹. The license includes the real-time deep learning-based AI Apps for face tracking, head pose recognition, facial recognition, and emotion recognition. These AI Apps are specifically designed for low-power and low-memory companion robot hardware platforms, enabling “always-on” operation without requiring an internet connection.

The global market for Companion Robots was valued at USD 1.98 billion in 2020 and, according to Mordor Intelligence, is expected to reach USD 11.24 billion by 2026, registering a CAGR of 34.34% during the period of 2021-2026². Companion robots are designed to interact naturally with humans, with the ability to perceive and respond to a user’s mental state, behaviours and commands. With these capabilities they can assist in combating loneliness and detecting depression, along with helping to keep people healthy at home through the remote monitoring of vital signs. This is achieved through visual comprehension, and NVISO’s human behavioural analytics AI systems have the capabilities to deliver this. NVISO’s solutions do this through its range of AI Apps providing visual observation, perception and semantic reasoning capabilities, the results of which can be used in identifying issues, in decision-making processes and in supporting autonomous “human like” interactions.
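The quoted growth rate can be sanity-checked with the standard CAGR formula. Computed here from the stated endpoints (USD 1.98 billion in 2020 to USD 11.24 billion in 2026), the rate lands near the quoted 34.34%, which Mordor Intelligence presumably bases on a 2021 starting value:

```python
# Compound annual growth rate: CAGR = (end / start) ** (1 / years) - 1
start, end, years = 1.98, 11.24, 6  # USD billions, 2020 -> 2026

cagr = (end / start) ** (1 / years) - 1
# Roughly 0.336, i.e. about 33.6% per year over the 2020-2026 span.
```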

Examples of these AI Apps include the analysis of core signals of human behaviour such as facial expressions, emotions, identity, head pose, gaze, gestures, activities, and the identification of objects with which users interact. These AI Apps can be optimised for the typically resource-constrained, low-power and low-cost processing platforms demanded by battery-operated consumer products. Furthermore, NVISO AI Apps can be easily configured to suit any camera system for optimal performance in terms of distance and camera angle, and thanks to NVISO’s large-scale proprietary human behaviour databases, NVISO’s AI Apps are robust to the real-world imaging conditions often found in consumer homes across the world. Unlike cloud-based solutions, NVISO’s solutions do not require information to be sent off-device for processing elsewhere, so user privacy and safety can be protected.

"Over the last two years we have worked with Panasonic’s development team to fulfil a vision to help people live at home healthier and happier through empathic intelligent devices. Our engineering team has delivered an excellent AI solution to meet the demanding power/cost/performance needs of high-volume edge-based consumer processing platforms required by manufacturers such as Panasonic,” said Tim Llewellynn, CEO of NVISO. “For several years, we have been focusing on partnerships and projects to integrate Visual Intelligence into deep learning accelerated hardware enabling breakthrough capabilities at mass production price points. Conclusion of this production license agreement with Panasonic provides additional evidence for the strong demand we are experiencing for the integration of advanced Human Behavioural Analytics technology into edge-based systems for a wide range of applications, ranging from consumer products through to medical devices and autonomous automotive systems.”

About NVISO: NVISO is an Artificial Intelligence ("AI") company founded in 2009 and headquartered at the Innovation Park of the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland. Its mission is to help make autonomous machines safe, secure, and personalised for humans. As leader in human behavioural AI, it provides software solutions that can sense, comprehend, and act upon human behaviour in real-world environments. It achieves this through real-time perception and observation of people and objects in contextual situations combined with the reasoning and semantics of human behaviour based on trusted scientific research. NVISO’s technology is made accessible through ready-to-use AI solutions addressing Smart Mobility and Smart Health and Living applications (interior sensing, health assessments, and robot interactions) with a key focus on deployments to the deep edge. With a singular focus on how to apply the most advanced and robust technology to industry and societal problems that matter, NVISO’s solutions help advance human potential. www.nviso.ai

About PANASONIC: Panasonic Corporation is a global leader developing innovative technologies and solutions for wide-ranging applications in the consumer electronics, housing, automotive, and B2B sectors. The company, which celebrated its 100th anniversary in 2018, operates 522 subsidiaries and 69 associated companies worldwide and reported consolidated net sales of 6,698.8 billion yen for the year ended March 31, 2021. Committed to pursuing new value through collaborative innovation, the company uses its technologies to create a better life and a better world for customers. Learn more about Panasonic: https://www.panasonic.com/global

1. https://news.panasonic.com/global/stories/2021/88701.html
2. https://www.mordorintelligence.com/industry-reports/social-robots-market
 
  • Like
  • Love
  • Fire
Reactions: 48 users

JK200SX

Regular
Two companies that I've been following are ZF and Seeing Machines, and now I've discovered they are customers of NVISO.

ZF:
I remember seeing this post from a colleague on Linkedin a few months ago:



and upon searching the ZF ProAI Supercomputer, I found:

ZF ProAI: The Source of Vehicle Intelligence​



The intelligence of future vehicles will be controlled by a few extremely powerful central computers like the new ZF ProAI. It is the most flexible, scalable and powerful automotive-grade supercomputer on the market.

Modern vehicles feature increasingly more software functions. When software combines the intelligence of sensor information with the control of smart actuators, new powerful functions like driver assist systems emerge, which provide more comfort, more safety, or even more range when driving electrically. This is only possible with smart software algorithms. In traditional vehicle architectures, the bits and bytes act mainly in decentralized electronic control units (ECUs). In new and future vehicle platforms, the electric and electronic architecture (E/E architecture) will change dramatically. The computing power and the main software functions will be bundled in a few domain or zone controllers.

At Auto Shanghai 2021, the ZF Group presented the next generation of its automotive-grade supercomputer ZF ProAI for the first time. This central computer perfectly meets the manufacturers' requirements for the software-defined vehicles and their new E/E-architectures. It can serve as domain, zone or central controller.
“The ZF ProAI is currently the most flexible, scalable, and powerful automotive-grade supercomputer in the world,” said Oliver Briemle, Head of L4 Feature Development, Domain Control and V2X at ZF.
It is suitable for any vehicle type and for all levels of automated and autonomous driving: from Level 2 to Level 5.


Standardization meets scalability​


Customized high-performance solutions for vehicle intelligence, suitable to any vehicle platform, but based on a single product – that’s the new ZF ProAI.

“When developing the new ZF ProAI, we paid particular attention to two things: standardization and scalability. Because this simplifies the choice immensely when selecting ZF ProAI as the central computer for a vehicle: No matter what the application needs are – ZF has the ideal solution,” said Briemle.

Standardization and flexibility are also key for the processor and the connectors: The supercomputer’s modular set-up means it can be equipped with System-on-Chip (SoC) variants from different manufacturers – depending on the planned application and the computing power required. The connectors are compatible with all common plugs on the market. Also, ZF has relied on the highest level of flexibility regarding software. In addition to in-house solutions, the supercomputer also runs programs and operating systems of the OEMs or third-party developers. Depending on the desired performance, three cooling options are available: passive cooling, air cooling and liquid cooling. Nevertheless, the installation dimensions remain the same: The new ZF ProAI has a uniform and compact housing size of only 240 x 138 x 49 millimeters. This is significantly smaller than its previous models and thus gives more freedom in the choice of installation options in the vehicle.


“When developing the new ZF ProAI, we paid particular attention to two things: standardization and scalability.”
Oliver Briemle, Head of L4 Feature Development, Domain Control and V2X at ZF


More performance, less power consumption


The new ZF ProAI also comes with more performance than ever before. Depending on the desired application from Level 2 to Level 5 and the corresponding computing power required, ZF ProAI can execute between 20 trillion and one quadrillion computing steps – per second. With a performance of up to 250 TOPS per unit, that’s an increase of 66 percent compared to the previous model, the ZF ProAI RoboThink.

And while performance increases, power consumption drops even more: With 1 watt of power, the ProAI achieves a performance of approximately 3 TOPS. That's up to 70 percent less power consumption than before. And, as an automotive grade product, its high-tech interior is resilient and reliable even under harsh conditions. It also offers an ASIL-D computing performance with state-of-the-art protection against cyber threats.
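These figures can be cross-checked with simple arithmetic. At roughly 3 TOPS per watt, a 250 TOPS unit implies a peak draw in the low tens of watts, and the quoted 66 percent uplift implies a figure for the previous-generation RoboThink. Both derived numbers below are back-of-envelope estimates from the article's rounded figures, not ZF specifications:

```python
# Implied power draw from throughput and efficiency.
tops_per_unit = 250.0  # quoted peak throughput per ZF ProAI unit
tops_per_watt = 3.0    # quoted efficiency (approximate)

implied_watts = tops_per_unit / tops_per_watt  # ~83 W at peak load

# The quoted 66% uplift over the previous generation implies the
# ZF ProAI RoboThink delivered roughly 150 TOPS.
robothink_tops = tops_per_unit / 1.66
```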

Enabling new vehicle functions and tapping their full potential: The AI-capabilities of ZF ProAI are optimized for deep learning processes, further enhancing its ability to deliver advanced safety features. The board offers a 360° GPU-driven fusion of all available sensor data, including environmental measurement data from radars, LiDARs, cameras and audio patterns.

ZF also provides a measurement data interface (MDI) for ProAI to forward the collected sensor data unaltered to a central storage system for development and testing purposes. This makes it much easier for developers to train artificial intelligence for autonomous driving.

Given such possibilities, it is not surprising that orders are already being placed with ZF. The first volume production for the latest version of the ZF ProAI will start in 2024.






SEEINGMACHINES:


They are based in Canberra, are a customer of NVISO, and are also working with XILINX (mentioned previously in these threads) on their FOVIO automotive chip.
 
  • Like
  • Fire
  • Thinking
Reactions: 33 users

Taproot

Regular
Nviso collaboration is a good one.
Plenty of doors to unlock thru this partnership.
Definitely a player

Nviso + Cetera Financial

Nviso + Seeing Machines

Nviso + Panasonic

Nviso + Paincheck

Nviso + Tobii DMS

Nviso + BNZ ( Bank of New Zealand )

Nviso + F&P Robotics
 
  • Like
  • Fire
  • Love
Reactions: 57 users

mcm

Regular
Quoting @Taproot's NVISO partner list above.
Such a shame this isn't (or can't be) announced on the ASX.
 
  • Like
  • Sad
  • Love
Reactions: 18 users
Hi Dhm
Maybe not this time. I have sent this article to Tony Dawe who is responsible for having Macquarie initiate coverage of Brainchip. It is a project he has had running to bring them on board. Congratulations to Tony.

The purpose of sending the article was to suggest it might be something to send to his Macquarie contacts to undermine the suggestion in their report that IBM's TrueNorth and Intel's Loihi are competitors in this life.

The beauty of this article is that the solution being explored gives you hypothermia in your lounge room, not to mention frostbite if you touch the TV remote, and only Elon Musk and Jeff Bezos could afford it.

I have also sent it to my other contact on the basis that they might add him to their list of academics to bring into the Brainchip fold.

In other words, don't bite him on the heel just yet. LOL He is so well qualified that he is a very useful resource.

Regards
FF

AKIDA BALLISTA
Hi @Dhm

Just heard from Tony Dawe and this link is on its way to Macquarie and "a few more".

Great believer in lateral thinking. Sometimes your competitor can be your greatest friend in this high tech world if their solution is impractical, expensive and a very long way off.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
Reactions: 33 users
Quoting @Taproot's NVISO partner list above.
A great resource generously shared. Many thanks @Taproot it has already been copied and sent off to my network. LOL FF
 
  • Like
  • Fire
  • Love
Reactions: 15 users

Murphy

Life is not a dress rehearsal!
It says there are 73 people waiting to watch this link. I wonder how many of them are us? 👀
Loved the quote "we want a Mercedes to constantly learn..."

If you don't have dreams, you can't have dreams come true!
 
  • Like
  • Love
Reactions: 13 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Quoting @JK200SX's post above on the ZF ProAI supercomputer and Seeing Machines.


PHOAR!

If ZF are a customer of NVISO then this makes a whole lot of sense (or sensors called AKIDA in them)!!!! 🥳


 
  • Like
  • Fire
  • Love
Reactions: 34 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
One of the co-founders is a Kiwi/Aussie!

Imagine what a surprise Tim’s going to get when I rock up on his doorstep and start chewing his ears off about BrainChip!

  • Haha
  • Like
Reactions: 15 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Don't want to jump to too many conclusions here, but it seems increasingly likely IMO that Panasonic are one of our EAPs?

Oh, and I underlined "companion robot" because it reminded me of what Rob Telson said about robot Ken needing a companion dog.


 
  • Like
  • Fire
  • Love
Reactions: 48 users

Proga

Regular
Why anybody here thought that the EQS contained AKIDA baffles me.

It has just been announced in the EQXX, a concept car. Features from concept cars take time to trickle down. The EQS is ready for mass production and would have been designed and engineered well before the EQXX was even a thing.

We need to be realistic here.
People are getting confused. Some of us were hoping the "Hey Mercedes" infotainment alone would be released with Akida in new models, not the full package with Nvidia due in 2024. It now looks likely Mercedes will wait until 2024 before releasing a commercial model with Akida incorporated right throughout the car.

Also means no revenue from Mercedes until 2024.
 
  • Like
  • Love
  • Wow
Reactions: 7 users