BRN Discussion Ongoing

53 payloads went into space on this March Transporter-10 mission, including a mix of new and returning SpaceX customers... I wonder which of these companies we are waiting on to "turn it on and start doing the work"? https://spacenews.com/spacex-launches-tenth-transporter-rideshare-mission/
I suspect it should be within this one.

ANT61 launched with SMC's Optimus, and we are with them inside their "Brain", which is yet to be switched on.



Space Machines Company: SMC launched its first Optimus OTV, which carried a suite of customer payloads to space. The 270-kg spacecraft can maneuver around to deposit its riders into specific orbits.

ANT61
29/02/2024

Today, our space hardware product, ANT61 Brain, has been successfully deployed by SpaceX on board the Space Machines Company Optimus-1 spacecraft.
The Brain is the world's first space-grade neuromorphic computer.
In the past, we used GPUs and TPUs for machine learning, but the neuromorphic technology pioneered by BrainChip is up to 5 times more energy-efficient; that's why we chose it for our computer that will analyse footage from engineering cameras, detecting anomalies in spacecraft operation.
The Brain will only be turned on later in the Optimus-1 mission, so stay tuned for future updates on the operation.

We want to take this opportunity to thank people who have been instrumental in making this mission and the ANT61 Brain product possible.
In line with our tradition, your names are written on the flight model that is now in orbit!
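
To make the "analyse footage from engineering cameras, detecting anomalies" part a bit more concrete, here is a minimal, hypothetical sketch of frame-level anomaly scoring. It is not ANT61's or BrainChip's actual pipeline (a real Brain deployment would run a trained SNN on Akida silicon, not a numpy loop); it only illustrates the kind of job being described.

```python
# Hypothetical illustration only: not ANT61's or BrainChip's actual software.
# Score each camera frame against a slowly adapting "normal" baseline and
# flag frames that deviate too far.
import numpy as np

def frame_anomaly_scores(frames: np.ndarray, alpha: float = 0.05) -> np.ndarray:
    """frames: (T, H, W) grayscale array in [0, 1]. Returns one score per frame."""
    baseline = frames[0].astype(np.float64)
    scores = np.zeros(len(frames))
    for t, frame in enumerate(frames):
        diff = frame.astype(np.float64) - baseline
        scores[t] = np.sqrt(np.mean(diff ** 2))            # RMS deviation from baseline
        baseline = (1 - alpha) * baseline + alpha * frame  # let the baseline adapt slowly
    return scores

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = rng.normal(0.5, 0.01, size=(50, 64, 64))  # steady, "healthy" scene
    frames[40:] += 0.2                                  # simulated fault from frame 40 on
    scores = frame_anomaly_scores(frames)
    threshold = scores[1:30].mean() + 5 * scores[1:30].std()
    print("anomalous frames:", np.flatnonzero(scores > threshold))
```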
 
  • Like
  • Fire
  • Love
Reactions: 30 users

Iseki

Regular
53 payloads went into space on this March Transporter-10 mission, including a mix of new and returning SpaceX customers... I wonder which of these companies we are waiting on to "turn it on and start doing the work"? https://spacenews.com/spacex-launches-tenth-transporter-rideshare-mission/
That will be Space Machines, an Australian startup, with its Optimus spacecraft, which it described as the largest commercial spacecraft built to date in Australia. The spacecraft is designed to demonstrate on-orbit maneuvering and future servicing technologies. The Optimus uses the ANT61 Brain™ computer, which serves as the primary intelligent control for a series of repair and maintenance robots that will be used to remotely repair damaged space vehicles. The ANT61 Brain has an Akida 1000 inside.
 
  • Like
  • Love
  • Fire
Reactions: 19 users

I’m not a techie, but for me there’s something cool in this comparative article

Industry titan AWS on one side of the balance scales (cloud-based, data centres, GPUs et al.), and our chipper holding up the other: low-power, high-efficiency SNNs for processing at the edge……

Love it……

Let’s hope in the near future there’s a new familiar name in the realm of titans ……. BrainChip!
 
  • Like
  • Fire
Reactions: 23 users
Recent EE Times podcast with SynSense, but there was some conversation/discussion afterwards with Dr. Giulia D’Angelo from the Fortiss research institute in Munich and Professor Ralph Etienne-Cummings of Johns Hopkins University, which I was more interested in.

A couple of excerpts from the tail end, and I do hope they manage to organise it for next season.


So I think folks have been thinking about that. And, just to kind of loop this all the way around, Tony recently I believe started with BrainChip—he’s the CTO at BrainChip. And, in fact, I think they announced a new product that they’re selling that is some edge machine-learning device that will, I imagine, take the kind of sensing data and then process it as well. But I don’t think it’s embedded. What I mean by embedded is I don’t think there’s any cameras associated with the chip itself, right? It’s more kind of a distributed system.

.......

SB: I’m hoping to get an interview with Prophesee for the next season when we’ll also be talking to a lot more companies including, hopefully, BrainChip, Rain Neuromorphics, and a few others. Sorry, Giulia.

.......

But that’s the way technology goes. There has to be diversity before you have consolidation. And consolidation is interesting, because we’ve already had consolidation between iniVation and SynSense—which is not surprising, because they have the sort of same founders and are in the same city. I’m wondering how much of that is going to happen, too, within the coming years amongst the others. Innatera and, you know, we’ve got Prophesee doing these deals with Qualcomm, we’ve got BrainChip.
 
  • Like
  • Fire
  • Thinking
Reactions: 31 users

Sirod69

bavarian girl ;-)
Low-Power Image Classification With the BrainChip Akida Edge AI Enablement Platform

 
  • Like
  • Love
  • Fire
Reactions: 23 users

Diogenese

Top 20
Recent EE Times podcast with SynSense, but there was some conversation/discussion afterwards with Dr. Giulia D’Angelo from the Fortiss research institute in Munich and Professor Ralph Etienne-Cummings of Johns Hopkins University, which I was more interested in.

A couple of excerpts from the tail end, and I do hope they manage to organise it for next season.


So I think folks have been thinking about that. And, just to kind of loop this all the way around, Tony recently I believe started with BrainChip—he’s the CTO at BrainChip. And, in fact, I think they announced a new product that they’re selling that is some edge machine-learning device that will, I imagine, take the kind of sensing data and then process it as well. But I don’t think it’s embedded. What I mean by embedded is I don’t think there’s any cameras associated with the chip itself, right? It’s more kind of a distributed system.

.......

SB: I’m hoping to get an interview with Prophesee for the next season when we’ll also be talking to a lot more companies including, hopefully, BrainChip, Rain Neuromorphics, and a few others. Sorry, Giulia.

.......

But that’s the way technology goes. There has to be diversity before you have consolidation. And consolidation is interesting, because we’ve already had consolidation between iniVation and SynSense—which is not surprising, because they have the sort of same founders and are in the same city. I’m wondering how much of that is going to happen, too, within the coming years amongst the others. Innatera and, you know, we’ve got Prophesee doing these deals with Qualcomm, we’ve got BrainChip.
For pity's sake! Let's hope @Bravo doesn't see this!
 
  • Haha
  • Like
Reactions: 9 users

Sirod69

bavarian girl ;-)
The BrainChip team wrapped up a successful presence at the Hardware Pioneers - Max24 event! Partnering with Edge Impulse, our booth saw a surge of interest in our technology, both from current and new enthusiasts. Alf Kuchenbuch highlighted the clear appeal of neuromorphic AI, emphasizing its orders of magnitude lower power demand. Stay tuned as BrainChip continues to expand into new frontiers, including space and beyond!


BrainChip and neuromorphic AI were celebrated at Hardware Pioneers - Max24!
Partnering with @EdgeImpulse, our booth was buzzing with attention.

 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 42 users

IloveLamp

Top 20

By extending processing and storage capabilities to the edge, we can reduce the latency and improve the cybersecurity of smart systems. Because computing occurs closer to the data source, the risk of data leakage is reduced.

Furthermore, edge storage alleviates the strain on cloud infrastructure by allowing manufacturers to send only relevant data to their cloud solutions. This reduces storage costs while also lightening the load on cloud-based analytics.
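
As a rough illustration of that "send only relevant data" pattern, here is a minimal sketch. upload_to_cloud() and the 80.0 limit are hypothetical stand-ins, not any particular manufacturer's SDK or policy; the point is simply that raw readings stay on the device and only flagged events go upstream.

```python
# Minimal sketch of "process at the edge, send only relevant data".
# upload_to_cloud() is a hypothetical placeholder for a real cloud client.
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    value: float

def upload_to_cloud(event: dict) -> None:
    # placeholder for a real transport (MQTT, HTTPS, etc.)
    print(f"uploading: {event}")

def process_at_edge(readings: list[Reading], limit: float = 80.0) -> None:
    kept_locally = []
    uploaded = 0
    for r in readings:
        kept_locally.append(r)             # full-resolution data stays on the device
        if r.value > limit:                # only events deemed relevant leave the edge
            upload_to_cloud({"sensor": r.sensor_id, "value": r.value})
            uploaded += 1
    print(f"retained {len(kept_locally)} readings locally, uploaded {uploaded}")

if __name__ == "__main__":
    data = [Reading("vibration-1", v) for v in (42.0, 55.5, 91.2, 60.1, 88.7)]
    process_at_edge(data)
```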
[image attachment]
 
  • Like
  • Love
  • Fire
Reactions: 20 users

7für7

Regular
Germany closed green! 🤔 I’m scared
 
  • Haha
  • Like
Reactions: 7 users

Boab

I wish I could paint like Vincent
Interesting how AI Labs have some of the same partners as us.
Hopefully they will find some customers to use Akida 2.0 in the areas that they commented on at the release.
So much potential; we sometimes forget how big the ecosystem is.
[two screenshot attachments]
 
  • Like
  • Fire
  • Thinking
Reactions: 23 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
An extract from Pat Gelsinger's (Intel CEO) interview with TIME on 3 May 2024.
[screenshot attachment]
 
  • Like
  • Love
  • Fire
Reactions: 23 users
Question for the more technically intelligent on BRN and mobile phones.
How do you see Akida being used in mobile phones, apart from a possible integration with the Prophesee camera?

Are there any opportunities with cybersecurity, or any other ways to integrate Akida, do you think?
 
Last edited:
  • Like
Reactions: 3 users

Diogenese

Top 20
Some time ago I responded to Magnus Ostberg (Mercedes Software guru) querying whether the water-cooled processor in the CLA concept car required the cooling because it was running NN software.

His enigmatic response was:

"STAY TUNED!"

His latest linkedin posts do nothing to dispel my interest in the use of Akida simulation software in MB.OS.

https://www.linkedin.com/feed/update/urn:li:activity:7201230087612940288/

we’ve enhanced the “Hey Mercedes” voice assistant so it can help with a wider range of everyday functions.

https://www.linkedin.com/feed/update/urn:li:activity:7202202756806230016/

"the design of MB.OS demands a different approach because we are decoupling the hardware and software innovation cycles and integration steps".

This could be interpreted to mean that the software is developing faster than the silicon, and the software is updatable, so it would make sense to use software simulations of evolving tech (Akida 2/TeNNs) until the IP was in a sufficiently stable state to move to silicon.

In fact the Akida 2/TeNNs patents were filed 2 years ago, and the EAPs would have been informed about Akida 2 then, so they would have been reluctant to commit to silicon as the silicon was still being developed. Indeed, the TeNNs concept would have been still in its early stages of development.

Mercedes would be anxious to implement the technology which was 5 to 10 times more efficient than alternatives in kws among other things, but they would not want to serve up "yesterday's" day-old bread rolls when they know there's a fresh bun in the oven.

Similarly, we have recently "discovered" that Valeo's Scala 3 does not appear to include Akida SoC, but it comes with software to process the lidar signals.

Akida 2 was ready for tape-out a little while ago, ie, the engineering samples, but it would not be ready for integration with some other processor (CPU/GPU) for some time, certainly not in time for the 2025 MB release ... and who the heck is doing the engineering samples/production run?!

This blurb from MB CES 2024 presentation confirms that they are using "Hey Mercedes" in software:

Mercedes-Benz heralds a new era for the user interface with human-like virtual assistant powered by generative AI (mbusa.com)

At CES 2024, Mercedes-Benz is showcasing a raft of developments that define its vision for the hyper-personalized user experience of the future – in-car and beyond. Headlining those is the new integrated MBUX Virtual Assistant. It uses advanced software and generative AI to create an even more natural and intuitive relationship with the car, with proactive support that makes the customer's life more convenient. This game-changing development takes the 'Hey Mercedes' voice assistant into a whole new visual dimension with Unity's high-resolution game-engine graphics. Running on the Mercedes-Benz Operating System (MB.OS) developed in-house, its rollout starts with the vehicles on the forthcoming MMA platform (Mercedes-Benz Modular Architecture). The Concept CLA Class, which celebrates its North American premiere at CES 2024, is based on this platform and likewise provides a preview of MB.OS.

The MBUX Virtual Assistant is a further development of the system first showcased in the VISION EQXX. It uses generative AI and proactive intelligence to make life as easy, convenient and comfortable as possible. For instance, it can offer helpful suggestions based on learned behavior and situational context.

So they are using Unity's game-engine graphics, but a quick glance did not find any Unity kws/nlp patents.

One possible corollary of this is that, when the EQXX Akida reveal was made a couple of years ago, it was about the use of Akida simulation software and not the Akida 1 SoC, but I'd have to go back to check this.

In any event, it seems pretty clear that there is a distinct possibility that the Mercedes CLE MBUX is using Akida simulation software until the design is sufficiently mature to produce the silicon.
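
To illustrate the "software simulation now, silicon later" idea in the abstract, here is a hypothetical sketch of the decoupling pattern. None of the names below come from BrainChip's MetaTF tooling or MB.OS; they are illustrative stand-ins for "one model behind two interchangeable backends", which is what decoupling the hardware and software innovation cycles would allow.

```python
# Hypothetical sketch only: stand-in names, not any real BrainChip or MB API.
from typing import Protocol
import numpy as np

class Backend(Protocol):
    def run(self, x: np.ndarray) -> np.ndarray: ...

class SoftwareSimulation:
    """Runs the network on the host processor; updatable over the air."""
    def run(self, x: np.ndarray) -> np.ndarray:
        return np.tanh(x)                  # stand-in for the real (e.g. KWS) network

class NeuromorphicAccelerator:
    """Would dispatch to dedicated silicon, once it is fitted and mature."""
    def __init__(self) -> None:
        raise RuntimeError("no accelerator present in this build")
    def run(self, x: np.ndarray) -> np.ndarray:
        raise NotImplementedError

def get_backend() -> Backend:
    try:
        return NeuromorphicAccelerator()   # prefer silicon when it exists
    except RuntimeError:
        return SoftwareSimulation()        # otherwise run the same model in software

if __name__ == "__main__":
    backend = get_backend()
    print(type(backend).__name__, backend.run(np.array([0.1, 0.5])))
```

Swapping the returned backend for real hardware later would not require touching the calling code, which is the sense in which the software can lead the silicon.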
 
  • Like
  • Fire
  • Love
Reactions: 76 users
Question for the more technically intelligent on BRN and mobile phones.
How do you see Akida being used in mobile phones, apart from a possible integration with the Prophesee camera?

Are there any opportunities with cybersecurity, or any other ways to integrate Akida, do you think?
Possibly LLM / SLM maybe imo.

Processed on-device, reducing the time to do so.
 
  • Like
  • Love
Reactions: 5 users

IloveLamp

Top 20
[four image attachments]
 
  • Like
  • Love
  • Fire
Reactions: 60 users

Kachoo

Regular
Some time ago I responded to Magnus Ostberg (Mercedes Software guru) querying whether the water-cooled processor in the CLA concept car required the cooling because it was running NN software.

His enigmatic response was:

"STAY TUNED!"

His latest linkedin posts do nothing to dispel my interest in the use of Akida simulation software in MB.OS.

https://www.linkedin.com/feed/update/urn:li:activity:7201230087612940288/

we’ve enhanced the “Hey Mercedes” voice assistant so it can help with a wider range of everyday functions.

https://www.linkedin.com/feed/update/urn:li:activity:7202202756806230016/

"the design of MB.OS demands a different approach because we are decoupling the hardware and software innovation cycles and integration steps".

This could be interpreted to mean that the software is developing faster than the silicon, and the software is updatable, so it would make sense to use software simulations of evolving tech (Akida 2/TeNNs) until the IP was in a sufficiently stable state to move to silicon.

In fact the Akida 2/TeNNs patents were filed 2 years ago, and the EAPs would have been informed about Akida 2 then, so they would have been reluctant to commit to silicon as the silicon was still being developed. Indeed, the TeNNs concept would have been still in its early stages of development.

Mercedes would be anxious to implement the technology which was 5 to 10 times more efficient than alternatives in kws among other things, but they would not want to serve up "yesterday's" day-old bread rolls when they know there's a fresh bun in the oven.

Similarly, we have recently "discovered" that Valeo's Scala 3 does not appear to include Akida SoC, but it comes with software to process the lidar signals.

Akida 2 was ready for tape-out a little while ago, ie, the engineering samples, but it would not be ready for integration with some other processor (CPU/GPU) for some time, certainly not in time for the 2025 MB release ... and who the heck is doing the engineering samples/production run?!

This blurb from MB CES 2024 presentation confirms that they are using "Hey Mercedes" in software:

Mercedes-Benz heralds a new era for the user interface with human-like virtual assistant powered by generative AI (mbusa.com)

At CES 2024, Mercedes-Benz is showcasing a raft of developments that define its vision for the hyper-personalized user experience of the future – in-car and beyond. Headlining those is the new integrated MBUX Virtual Assistant. It uses advanced software and generative AI to create an even more natural and intuitive relationship with the car, with proactive support that makes the customer's life more convenient. This game-changing development takes the 'Hey Mercedes' voice assistant into a whole new visual dimension with Unity's high-resolution game-engine graphics. Running on the Mercedes-Benz Operating System (MB.OS) developed in-house, its rollout starts with the vehicles on the forthcoming MMA platform (Mercedes-Benz Modular Architecture). The Concept CLA Class, which celebrates its North American premiere at CES 2024, is based on this platform and likewise provides a preview of MB.OS.

The MBUX Virtual Assistant is a further development of the system first showcased in the VISION EQXX. It uses generative AI and proactive intelligence to make life as easy, convenient and comfortable as possible. For instance, it can offer helpful suggestions based on learned behavior and situational context.

So they are using Unity's game-engine graphics, but a quick glance did not find any Unity kws/nlp patents.

One possible corollary of this is that, when the EQXX Akida reveal was made a couple of years ago, it was about the use of Akida simulation software and not the Akida 1 SoC, but I'd have to go back to check this.

In any event, it seems pretty clear that there is a distinct possibility that the Mercedes CLE MBUX is using Akida simulation software until the design is sufficiently mature to produce the silicon.
Hi Dio,

I'm aware that MB played with both chips, the Akida 1000 and, as recently as last October, the Akida 1500. It could be for various trials, so I understand your software point.

We also know that there has been a tonne of talk about the software in the last few years. We also know, and have been told, that many put Akida 1000 development on hold for 2.0, which is much superior in performance and meets what the target audience wants.

This would also line up with the exit of Chris Stevens: his comments in sales were that the hardware product was not ready, hence his departure.

So is it easy enough to implement the Akida 2.0 hardware once it reaches a stage of readiness?

Clearly MB has not abandoned BRN, as we are trusted; the only thing is that the relationship has not been clearly defined, as it's fluid.

So if we have Valeo and MB using software, it should generate revenue to a degree, but not as elevated as hardware sales.

So the big question is: did Sean say we are not putting 2.0 out because they do not want to compete with a customer? Whoever this customer is, it would need to be pretty secure for us to wait.

As for the 1000 and 1500, they really should put more into production, as they have demand. Or will it all now have to be 2.0, and will the ones using Akida ES, such as ANT61, VVDN, Unigen and others, have to purchase a 2.0 variant for their products?
 
  • Like
  • Fire
  • Thinking
Reactions: 14 users