BRN Discussion Ongoing

"Exciting developments in the world of TinyML MCU-Accelerators for 2024! 🚀 Discover cutting-edge solutions tailored for your needs:
🔷 Nuvoton Technology Corporation's NuMicro M55M1 with Arm Cortex-M55 and Ethos-U55: https://lnkd.in/d5RNj_Yp
🔷 Alif Semiconductor Ensemble Family - Dual Core Cortex-M55 + Dual Ethos-U55, already in mass production: https://lnkd.in/dv-Hrkgt
🔷 Infineon Technologies' PSoC Edge devices, powered by high-performance Arm Cortex-M55 and Ethos-U55, including Helium™ DSP support: https://lnkd.in/dVMupP-8
🔷 Himax Technologies, Inc.'s WiseEye2 (WE2) AI solution featuring Arm Cortex-M55 CPU and Ethos-U55 NPU: https://lnkd.in/dmy423Qr
🔷 Renesas Electronics introduces the first Cortex-M85 with built-in AI capabilities: https://lnkd.in/dWjVyDS4 & https://lnkd.in/dtRBHTfc
If you're navigating these choices and need guidance, feel free to reach out! 🤖

View attachment 53827
Unfortunately, Renesas linked with Plumerai on the RA MCU and M85.

However, drawing a long bow here, as I haven't really read up on Plumerai: was the Renesas tape-out for a 3rd party, as I read it, and hence the subsequent BRN update about Akida being able to work with the M85 part of this? Hmmm, I wish :unsure:



Renesas Launches New RA8D1 MCU Group​

Renesas’ Arm® Cortex®-M85 Processor-Based Devices Offer Ultra High Performance, Graphics Capabilities and Leading-Edge Security for Building Automation, Smart Home, Consumer and Medical Applications​




 
  • Like
  • Sad
Reactions: 6 users

Diogenese

Top 20
Thankfully I personally don't mix electronics and drive :ROFLMAO:
'sfunny - the old engineers who were there on the cusp of the changeover from valves to transistors would always caution the young engineers not to mix valves and transistors ... not that I ever met one who was tempted to, what with the difference in operating voltages - but that was well before the days of 0.05.

Even at 0.08, you could have a few drinks and still drive. In fact I was a much better driver at 0.08 than I ever was at 0.05.
 
  • Haha
  • Like
Reactions: 26 users

IloveLamp

Top 20
  • Like
  • Thinking
  • Fire
Reactions: 9 users

Dougie54

Regular
Unfortunately, Renesas linked with Plumerai on the RA MCU and M85. […]
I’m thinking the same
 
  • Like
Reactions: 4 users

Proga

Regular

AI Assistants, In-Car Gaming And Tons Of Range: Coming To Your Next Mercedes-Benz​

Mercedes has some bold, experimental ideas for how your driving experience could feel in the next few years. Will customers bite?​

Mercedes-Benz Tech CES Announcements 2024


By: Patrick George
Jan 8, 2024 at 9:00pm ET


The fight for automotive supremacy in the future won't come down to horsepower or mechanical specs. It will probably come down to software; at least, that's what the car companies tell us. That idea may seem radical to anyone driving a car from the last decade who's just happy that their Apple CarPlay or Android Auto setup works so well, but all of the automakers have much bigger, grander plans for what we experience inside their cars than we currently do.
Case in point: Mercedes-Benz's big announcements at CES 2024 aren't about performance or even electric range this time, but the future software-driven features coming to its cars very soon.

Gallery: Mercedes-Benz CES 2024 Tech Announcements​

MBUX Sound Drive (18 photos)
The giant Las Vegas tech show served as a preview for a bevy of Mercedes technologies coming to both its EVs and internal combustion-powered cars starting next year, including its smartest voice-controlled virtual assistant ever thanks to AI; more expansions into in-car gaming; a taste of its new all-encompassing suite, MB.OS; the North American debut of its future electric sedan and Tesla Model 3 competitor; and a unique system co-developed with musician will.i.am that "allows music to react to how the car is being driven," in Mercedes' words. (More on that in its own story right here.) Let's dive in.
Get Fully Charged​

In-car software and AI is the big story at CES this year
After years of EV and concept car debuts, automakers like Mercedes, Volkswagen and BMW are focused on upping their software game at the tech mega-show in 2024. All car companies have huge plans for revenue through software features, including via subscriptions.

MB.OS' Party Trick Is Its Virtual Assistant​


If you've been fortunate enough to experience Mercedes' MBUX software system across all three parts of the Hyperscreen on certain EQ electric cars, you know how powerful and graphically impressive it is. (At least, that's my take.) Equally impressive are the voice controls, where you can say "Hey, Mercedes" and get a swift response to most things you need without having to fish through the Hyperscreen's menus when you're on the road.
To that, Mercedes says: "You ain't seen nothing yet." The MB.OS system coming soon to the company's all-new MMA platform is a purpose-built chip-to-cloud operating system that brings a number of diverse functions under one roof: in-car content, charging, maps, car setting controls and automated driving.

A big part of that will be the MBUX Virtual Assistant, which Mercedes calls "the most human-like interface with a Mercedes-Benz yet." Powered by ChatGPT—something Volkswagen announced today at CES as well—Mercedes' Chief Technology Officer Markus Schäfer said the system can do more than just follow your voice commands. It can have a "dialogue" with users, and even pick up on the moods they're in to respond with "empathy."
The AI has four different personality traits—Natural, Predictive, Personal and Empathetic—that can be adjusted to your personal tastes or situation behind the wheel. The AI projects a glowing three-pointed star logo with various color changes and animations to illustrate its state of being. It could glow a certain way when providing a warning, for example, and a different way when it's listening to you.
"In some cases, hopefully, the assistant predicts already what you want, depending on your moods, the daytime, the location you are in and the circumstances you are in," Schäfer said. But after an easy conversation, Schäfer said the assistant "can address whatever you want from your car, whether you want entertainment, you want information and so on."


The assistant has four different "emotions" and can respond to your requests based on your own mood at the time. "You can have a constant interaction with your car," Schäfer told InsideEVs at a press roundtable earlier today.
In other words, this system should do a lot more than just tell you where the nearest charging station is when you say "Hey, Mercedes." It may also commiserate with you about how annoyed you are to be stuck in traffic.

Surround Navigation Turns Your Driving Experience Into A Video Game​

3D Navigation

These days, it's not uncommon for high-tech EVs to have displays with visualizations of the cars and objects in the surrounding area; Tesla's center screen is arguably the best example of this. But Mercedes is taking this to a different level.
Using the Unity game engine, the next-gen cars will offer MBUX Surround Navigation: a 3D visualization that allows you to "see what the car sees," as Mercedes puts it. Though your eyes are good at what they do, this system gives more of a bird's-eye view of other cars, trucks, cyclists or even pedestrians—superimposing potential hazards onto the route guidance to help you navigate a tough urban environment beyond just what you yourself can see.
"So for example, navigating through downtown traffic, finding a parking spot, knowing where you're going to make your turn"—all are projected via Surround Navigation, Mercedes' Chief Software Officer Magnus Östberg said during the press conference. The 3D game engine projects "all the sensor data that we're taking from the cameras, the radar, the LIDAR, that really gives the driver situational awareness. It's not only beautiful graphics but also a safe experience," he said.
Though we weren't offered the chance to experience this for ourselves at CES, it does seem like an impressive evolution of the navigation experience—and a good visualization of much of the data the car normally keeps to itself.

More In-Car Gaming Soon​


Mercedes also knows you need something to do while you're charging, and at least one answer to that is the video games I had when I was a kid. Bubble Bobble is back, kids. Get excited.
Mercedes announced a wide array of partnerships bringing new music, movies, video games and other content to those giant screens soon. Those include Audible and Amazon Music for podcasts and audiobooks, as well as Antstream Arcade for retro cloud-based gaming. It's not far off what you can already do with your smartphone, but the native access to these services should provide a nice experience on some much bigger, more high-res screens when you have downtime in the car.

The Next CLA Makes Its North American Debut​

Concept CLA Live CES

Though some early iterations of these features are available on the new 2024 Mercedes E-Class, which runs a "precursor" to MB.OS, the real show starts when the car above hits the road next year. That's the Concept CLA Class, which first debuted at IAA Munich in 2023 and is making its first-ever appearance in North America for CES.
The final production version of the next CLA isn't expected to veer too far from the concept here, and when it does, it will not only be a Model 3 fighter—something badly needed in an SUV-focused world—but it will debut MB.OS for the first time. That car will also be the first of at least four EVs on the new MMA platform, which, together with MB.OS, represents a much more cohesive transformation of Mercedes than we've seen before. By putting all these technologies together with one software platform, the automaker says it can offer a level of features and personalization it's never been able to before.
And the next CLA is expected to be a mighty EV, not just a computer on wheels. With an 800-volt architecture, an estimated 466 miles of range on Europe's WLTP cycle and the ability to add an estimated 248 miles of range in 15 minutes, this should be a winner if it even comes close to those claims. Furthermore, with an estimated 5.2 miles per kilowatt-hour, it should be one of the most efficient EVs on the road.
We'll see this and more from Mercedes as CES goes on this week.
Contact the author: patrick.george@insideevs.com




The below is disappointing, Bravo. I was expecting the full-blown version in the 2024 model. I'm not sure if the 2024 model will have the new MBUX etc. or the 5 microcontrollers. Might have to wait another year :(

Though some early iterations of these features are available on the new 2024 Mercedes E-Class, which runs a "precursor" to MB.OS, the real show starts when the car above hits the road next year. That's the Concept CLA Class, which first debuted at IAA Munich in 2023 and is making its first-ever appearance in North America for CES.
 
  • Like
  • Sad
Reactions: 6 users

MDhere

Regular
Unfortunately, Renesas linked with Plumerai on the RA MCU and M85. […]
Akida assists the M series 👍
 
  • Like
  • Fire
Reactions: 6 users
Akida assists the M series 👍
Agree, but having a look at Plumerai, they seemed to be working on BNNs (binarised neural networks), not SNNs.

Not sure if that's changed, but it's what they were blogging about back in '21, and it's some of the info on the Arm site too.
 
  • Like
Reactions: 1 users

Sirod69

bavarian girl ;-)
Akida assists the M series 👍

Renesas to Demonstrate First AI Implementations on the Arm Cortex-M85 Processor Featuring Helium Technology at Embedded World​

March 9, 2023 -- TOKYO, Japan ― Renesas Electronics Corporation (TSE:6723), a premier supplier of advanced semiconductor solutions, today announced that it will present the first live demonstrations of artificial intelligence (AI) and machine learning (ML) implementations on an MCU based on the Arm® Cortex®-M85 processor.


BrainChip integrates Akida with Arm Cortex-M85 Processor, Unlocking AI Capabilities for Edge Devices​


Laguna Hills, Calif. – March 12, 2023 BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power, fully digital, event-based, neuromorphic AI IP, today announced it has validated that its Akida™ processor family integrates with the Arm® Cortex®-M85 processor, unlocking new levels of performance and efficiency for next-generation intelligent edge devices.





 
  • Like
  • Love
  • Thinking
Reactions: 17 users

Diogenese

Top 20

Renesas to Demonstrate First AI Implementations on the Arm Cortex-M85 Processor Featuring Helium Technology at Embedded World […]

BrainChip integrates Akida with Arm Cortex-M85 Processor, Unlocking AI Capabilities for Edge Devices […]
1 + 1 = 0
 
  • Haha
  • Like
  • Thinking
Reactions: 10 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
1 + 1 = 0
But Renesas are going to announce the new 22nm with BrainChip shortly, don't you think? They just didn't want to do it too close to their M85/Arm offering.
 
  • Like
  • Love
Reactions: 8 users

Sirod69

bavarian girl ;-)
  • Haha
  • Love
  • Like
Reactions: 11 users
I'm patiently waiting for some Akida-related announcement, fabricated in 7nm (Nandan mentioned this process node for an energy-consumption example in a video/interview some time ago).
 
  • Like
  • Thinking
Reactions: 3 users

Diogenese

Top 20
why 0, when will 1 + 1 come together?
Who Knows Idk GIF by The Bachelor Australia
Hi Sirod,

The Akida family are all compatible with ARM's M85, but they are not part of M85. They are optional extras.

Helium is basically Arm's in-house vector extension (the M-Profile Vector Extension, MVE) built into the CPU for machine learning and digital signal processing.

https://www.arm.com/products/silicon-ip-cpu/cortex-m/cortex-m85

Arm Cortex-M85 is the highest performing Cortex-M processor with Arm Helium technology and provides the natural upgrade path for Cortex-M based applications that require significantly higher performance and increased security.

Renesas has said that they will use their in-house DRP-AI for "heavy" AI loads, and Akida will be used for simple AI tasks. It would be surprising if Renesas were to produce the high performance M85 with only 2 nodes of Akida.

Brainchip has said that they are unable to say when the Renesas processor incorporating Akida will be produced. That is entirely within the control of Renesas.
 
Last edited:
  • Like
  • Fire
  • Sad
Reactions: 20 users
So, have we got an offshoot, rebranding, part of the restructure process on NVISO?

BeEmotion.ai.

I thought the layout, images and graphics looked similar when I looked at their homepage initially, and digging deeper I find some of what I snipped below.

When I scrolled to the bottom of the page I can see the site is managed by Tropus here in Perth. I know exactly where their office is, as I drive past it for certain client meetings.

There is also a small Japanese link at the bottom which takes you to the Japanese site.








HUMAN BEHAVIOUR AI​

NEUROMORPHIC COMPUTING​

BeEmotion empowers system integrators to build AI-driven human machine interfaces to transform our lives using neuromorphic computing. Understand people and their behavior in real-time without an internet connection to make autonomous devices safe, secure, and personalized for humans.


NEUROMORPHIC COMPUTING INTEROPERABILITY​

ULTRA-LOW LATENCY WITH LOW POWER​


ULTRA-LOW LATENCY (<1ms)​

Total BeEmotion Neuro Model latency is similar for the GPU and the BrainChip Akida™ neuromorphic processor (300 MHz); however, CPU latency is approximately 2.4x slower. All models on all platforms can achieve <10ms latency, and the best model can achieve 0.6ms, which is almost 2x faster than a GPU. On a clock-frequency-normalised basis, this represents an acceleration of 6x.

HIGH THROUGHPUT (>1000 FPS)​

BeEmotion Neuro Model performance can be accelerated by an average of 3.67x using the BrainChip Akida™ neuromorphic processor at 300MHz over a single-core Arm Cortex-A57, as found in an NVIDIA Jetson Nano (4GB), running at close to 5x the clock frequency. On a clock-frequency-normalised basis, this represents an acceleration of 18.1x.

SMALL STORAGE (<1MB)

BeEmotion Neuro Models can achieve a model storage size under 1MB, targeting ultra-low-power MCU systems where onboard flash memory is limited. Removing the need for external flash memory saves cost and power. The BrainChip Akida™ format uses 4-bit quantisation, where the ONNX format uses Float32.

DESIGNED FOR EDGE COMPUTING

NO CLOUD REQUIRED


PRIVACY PRESERVING

By processing video and audio sensor data locally, nothing has to be sent over a network to remote servers for processing. All processing can be performed disconnected from any central server - a more secure and private architecture that reduces security risks.
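As a sanity check on the clock-frequency-normalised figures in the snip above, the arithmetic can be reproduced in a few lines. Note the baseline clocks are assumptions consistent with the quoted wording (the Jetson Nano's Cortex-A57 at ~1.43 GHz, i.e. "close to 5x" Akida's 300 MHz, and its GPU at ~900 MHz); only the raw speedups and Akida's 300 MHz come from the quoted text.

```python
# Sanity-checking BeEmotion's clock-frequency-normalised speedup claims.
# Assumed baseline clocks: Cortex-A57 ~1430 MHz, Jetson Nano GPU ~900 MHz.

AKIDA_MHZ = 300

def normalised_speedup(raw_speedup, baseline_clock_mhz, accel_clock_mhz=AKIDA_MHZ):
    """Scale a measured speedup by the baseline/accelerator clock ratio."""
    return raw_speedup * (baseline_clock_mhz / accel_clock_mhz)

# Throughput: 3.67x raw over a ~1.43 GHz Cortex-A57 -> ~17.5x,
# close to the quoted 18.1x
print(round(normalised_speedup(3.67, 1430), 1))   # 17.5

# Latency: ~2x raw over a ~900 MHz GPU -> 6x, matching the quoted figure
print(round(normalised_speedup(2.0, 900), 1))     # 6.0

# Storage: 4-bit Akida weights vs Float32 ONNX weights -> 8x smaller models,
# consistent with fitting under 1MB of on-chip flash
print(32 // 4)                                    # 8
```

So the headline 18.1x and 6x numbers are just the raw measured speedups scaled by the clock ratio, and the <1MB storage claim follows directly from 4-bit vs 32-bit weights.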
 
  • Like
  • Fire
  • Love
Reactions: 56 users

Frangipani

Regular
why 0, when will 1 + 1 come together?
Who Knows Idk GIF by The Bachelor Australia

Let me confuse you even more:
As you will surely recall, sometimes 2 x 3 equals 4 and 3 x 3 = 6! 😉


 
  • Haha
  • Love
  • Thinking
Reactions: 5 users
So, have we got an offshoot, rebranding, part of the restructure process on NVISO? BeEmotion.ai. […]
Nice to see ZF listed under the client / partner section like Brainchip.

Wouldn't mind a hook up with ZF.

Geez....they even have the BRN iceberg already :ROFLMAO:


 
  • Like
  • Haha
  • Love
Reactions: 21 users

Diogenese

Top 20
So, have we got an offshoot, rebranding, part of the restructure process on NVISO? BeEmotion.ai. […]
This is a great endorsement of Akida!

... and BeEmotion will need to do a lot of brand-recognition work, and it looks like we will be dragged along on their coattails. They need us to show the benefits of their product.

So nViso must have lost out on the trade mark front to the security software mob. Maybe they thought using lower case with the capital V was sufficient to distinguish them but, apart from being a feeble point of difference, it doesn't help phonetically.
 
  • Like
  • Haha
Reactions: 28 users
This is a great endorsement of Akida! […]
Good point.

Could be that, re the trademark.

Happy to take the good wraps: if they're selling their Neuro SDK with a recommendation to run it on Akida, given some of those quick stats, then coattails it is, hopefully.
 
  • Like
Reactions: 15 users

Frangipani

Regular
So, have we got an offshoot, rebranding, part of the restructure process on NVISO? BeEmotion.ai. […]

Great find, @Fullmoonfever!





The weird thing is - the old NVISO website www.nviso.ai is still active and lists the exact same job offer under “Careers” as www.beemotion.ai! Except for Head Of Product Bogdan Lazar’s email address, that is.
I think you may have discovered their rebranding before they went public with it!

Apparently, they are still headquartered at the same location, namely at EPFL (École Polytechnique Fédérale de Lausanne) Innovation Park in Switzerland, but also have offices in Tokyo, Dubai and Sydney.
(That info is missing on the new website, at least in the careers section).

The YPB ASX announcement dated July 27, 2023 said “Nviso SA was founded in 2009 in Switzerland with operations in Japan and recently, as part of its management and Board changes, has its Corporate Headquarters in Australia under NVISO Group Limited.”
So I assume they differentiate between their Lake Geneva research headquarters and their new head office in Sydney? I find it rather unusual that they don’t have a “Team” subpage on either website.


 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 35 users

Frangipani

Regular
This is the research institute they are or will be collaborating with:


 
  • Like
  • Love
Reactions: 13 users
Top Bottom