BRN Discussion Ongoing

Diogenese

Top 20
@Diogenese or anyone else who knows - can Akida be used as a chiplet?
It can, but I remember PvdM being somewhat unenthusiastic about, or dismissive of, the idea. One issue is that there needs to be communication between the chiplets - mini von Neumann bottlenecks?

That is why "at-the-sensor" (as a single SoC) would be preferred.
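Back-of-envelope, for anyone who wants a feel for why the off-chip hop matters. The energy-per-bit figures below are illustrative assumptions (roughly: on-chip SRAM ~0.1 pJ/bit, a die-to-die chiplet link ~1 pJ/bit, off-package DRAM tens of pJ/bit), not BrainChip numbers:

```python
# Rough energy cost of handing one layer's activations to the next stage.
# The pJ/bit values are order-of-magnitude assumptions, not vendor data.

PJ_PER_BIT = {
    "on-chip SRAM": 0.1,       # staying inside a single at-the-sensor SoC
    "die-to-die link": 1.0,    # crossing a chiplet boundary
    "off-package DRAM": 20.0,  # a classic von Neumann round trip
}

activations = 1_000_000  # feature-map values passed between stages
bits_each = 4            # Akida-style low-bit activations (1-4 bits)

for path, pj in PJ_PER_BIT.items():
    nanojoules = activations * bits_each * pj / 1000  # pJ -> nJ
    print(f"{path:>16}: {nanojoules:10.1f} nJ per frame")
```

Same data, same computation; only the distance travelled changes. That is the mini von Neumann bottleneck in a nutshell, and why keeping everything on one die is attractive.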
 
  • Like
  • Love
  • Fire
Reactions: 17 users

Foxdog

Regular
Afternoon Foxdog ,

You forgot to add ..

GlobalFoundries has relatively recently been granted FULL GREEN LIGHT clearance to produce not only garden-variety chips, but the USA's most technologically advanced chips for ALL MILITARY APPLICATIONS.

Oh, and by the way, they just happen to have brought BrainChip's latest chip to life, AKIDA 1500, in physical silicon..... refer to my last picture of physical silicon.

😃.

Won't be long now.

Regards,
Esq.
Legend m8, well said 👌
 
  • Fire
  • Like
Reactions: 2 users

Terroni2105

Founding Member
It can, but I remember PvdM being somewhat unenthusiastic about, or dismissive of, the idea. One issue is that there needs to be communication between the chiplets - mini von Neumann bottlenecks?

That is why "at-the-sensor" (as a single SoC) would be preferred.
Thanks Dio and thanks Rise too.
 
  • Like
Reactions: 4 users

ndefries

Regular
Always good to re-read these, especially when people like PvdM said the market doesn't understand the importance/value of this. MegaChips would not have incurred this cost if they did not have a plan and need Akida. They can also go and write seriously large business, and we will only know through the financials if the customer keeps Akida secret. That hasn't happened yet, but the day it does will be a massive bonfire of shorts.

I am so hopeful that they saw this going straight into a future Nintendo and the rest is history.
 
  • Like
  • Love
  • Fire
Reactions: 19 users

Jefwilto

Regular
Happy Father's Day to our Tsx family, and if you don't have any children of your own, we are all fathers/mothers of our baby “Akida”. Good things will come soon 😊
 
  • Like
  • Love
  • Fire
Reactions: 28 users
Happy Father's Day to our Tsx family, and if you don't have any children of your own, we are all fathers/mothers of our baby “Akida”. Good things will come soon 😊
Cheers mate, no kids here; I use prophylactics, but if it happens soon I'll name him Harry Houdini.
 
  • Haha
  • Like
Reactions: 7 users

Getupthere

Regular

 
  • Like
Reactions: 3 users

Foxdog

Regular
Nothing is guaranteed.

Much like Tata Consultancy Services, Mercedes has been working on neuromorphic compute for over 5 years. BrainChip was invited to help implement the advantages of neuromorphic compute in their vehicles, not to help them understand it. Mercedes-Benz would have been through an internal assessment process, moved towards an SNN architecture, then evaluated options on how to implement it (internal development, available commercial partners). My point being that they have been at it for a while.

I then reflect on this article from 6th Jan 2022.

Software-defined vehicles, software-based simulation and neural processors in EVs and connected cars, with a look at developments from General Motors, Mercedes-Benz and Blackberry.


The gist of the article is that development times have improved enormously by using software simulation rather than building physical models. The VISION EQXX went from a white paper to a completed vehicle in only 18 months.

This means new, innovative concepts (like neuromorphic compute) that may have taken 5+ years to reach the market in a vehicle should now take only 2 to 3 years.

This article, as we all know, went on to talk about Akida technology in the EQXX.

.....

Neuromorphic computing for infotainment

This efficiency is not just being applied to enhancing range, though. Mercedes-Benz also points out that its infotainment system uses neuromorphic computing to enable the car to “take its cue from the way nature thinks”.

View attachment 43674

Neuromorphic computing systems have the potential to radically reduce the energy needed to run the latest AI technologies in vehicles. (Image: Mercedes-Benz)

The hardware runs spiking neural networks, in which data is coded in discrete spikes and energy only consumed when a spike occurs, reducing energy consumption by orders of magnitude. In order to deliver this, the carmaker worked with BrainChip, developing the systems based on its Akida processor. In the VISION EQXX, this technology enables the “Hey Mercedes” hot-word detection five to ten times more efficiently than conventional voice control. Mercedes-Benz said although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years. When applied on scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies.

......
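As a toy illustration of the article's “energy only consumed when a spike occurs” point, here's a minimal NumPy sketch of the event-driven principle (my own illustration, not Akida's actual implementation): a dense layer touches every weight on every timestep, while a spike-driven layer only accumulates the weight rows of neurons that actually fired.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 1024, 256
W = rng.standard_normal((n_in, n_out))

# Sparse binary spike vector: ~2% of input neurons fire this timestep.
spikes = (rng.random(n_in) < 0.02).astype(np.int8)

# Dense path: every weight participates, spiking or not.
dense_out = spikes @ W  # n_in * n_out multiply-accumulates

# Event-driven path: only the rows of neurons that spiked are touched.
active = np.flatnonzero(spikes)
event_out = W[active].sum(axis=0)  # len(active) * n_out additions

assert np.allclose(dense_out, event_out)  # identical result
print(f"dense MACs: {n_in * n_out:,}")
print(f"event ops : {active.size * n_out:,} ({active.size / n_in:.1%} of inputs fired)")
```

With binary spikes the multiplies collapse into additions, and with ~98% of inputs silent the work drops by the same factor; that combination is where the “orders of magnitude” energy claims come from.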

So, in Jan 2022 Mercedes was stating “just a few years”.
Does a market release of the new Concept CLS class in, say, June 2024 (I don't know release dates), 2.5 years from that statement, fit with all of the evidence above and constitute “a few years”? I'm guessing it does.
Still no guarantees of course, but the timelines seem to fit.

I also like the fact that Merc continues to add to its Team AI Research group.
With a focus on, you guessed it, spiking neural networks.
So I hope, and expect, that the footprint of Akida within MB vehicles will continue to evolve and grow over later releases.

This job was advertised on 7th August.

View attachment 43675

As an employee of Team AI Research, you will research current trends in artificial intelligence such as GenAI, quantum AI or neuromorphic computing. We intensively examine new artificial intelligence methods and develop them further, together with our colleagues from the specialist departments, in order to make them usable for our company. We strive to expand the limits of artificial intelligence and create customised, innovative solutions for our customers. In our team, you can actively shape the future of technology.


These challenges await you:

  • Applied research in the field of neuromorphic computing and spiking neural networks
  • Development of innovative algorithms for automotive applications
  • Implementation of neuromorphic algorithms for automotive applications
  • Researching and transferring current neuromorphic computing trends for automotive applications with a special focus on ADAS
I assume BrainChip has been working with MB for nearly 6 years.
 

Attachments

  • Screenshot_20230903-153155_Chrome.jpg
    Screenshot_20230903-153155_Chrome.jpg
    579.4 KB · Views: 163
  • Like
  • Fire
  • Love
Reactions: 48 users

Rach2512

Regular



First off, this is 5 days old; lots of comments re the reduction in code...

Could someone more technical than me maybe post a comment to highlight the capabilities of Akida? Stephen (the presenter) himself comments that neural networks are beyond him. He has a lot of followers; perhaps he might do a deep dive into BrainChip? Also, the Bull on the news interview says tech is on a new run, worth a watch. Could we be a fit with Tesla?
 
  • Like
  • Fire
  • Love
Reactions: 15 users

Rach2512

Regular



First off, this is 5 days old; lots of comments re the reduction in code...

Could someone more technical than me maybe post a comment to highlight the capabilities of Akida? Stephen (the presenter) himself comments that neural networks are beyond him. He has a lot of followers; perhaps he might do a deep dive into BrainChip? Also, the Bull on the news interview says tech is on a new run, worth a watch. Could we be a fit with Tesla?


Also, just to note, Steven states that the video posted by Elon had 11 million views. That could be 11 million more people now aware of neural networks for the first time; maybe not all of them, but a good percentage.
 
  • Like
  • Fire
  • Love
Reactions: 15 users

Rach2512

Regular
Unrelated, but when I write a post and want to go back and edit it, it won't let me. Is anyone else having the same problem?
 
  • Haha
Reactions: 1 users
Unrelated, but when I write a post and want to go back and edit it, it won't let me. Is anyone else having the same problem?
Refresh the page and try again, and yes to your question. That fixes it for me.
 
  • Like
Reactions: 3 users

Rach2512

Regular

I'm still having the same problem.
I know it's probably user error; I've never had this problem before, though.
 
  • Haha
  • Like
  • Thinking
Reactions: 4 users

Rach2512

Regular
Not sure mate, but it's a recent thing for me also; refreshing, or closing and then reopening the browser, normally works for me.
Thanks, I'll close off then and restart; hopefully that will fix it. Not that I post very much.
 
  • Haha
  • Like
Reactions: 2 users

HopalongPetrovski

I'm Spartacus!
Thanks, I'll close off then and restart; hopefully that will fix it. Not that I post very much.
Switching things off and then back on again is about as technical as I get too.
Just have to hope the RAM gremlins and the cache monkeys haven't written down anywhere that they hate me in the meantime. 🤣
Good luck and Godspeed to us all tomorrow, and here's hoping the shorters all fall over each other trying to get out the door after the trading halt announcing our massive deal with xxx gets dropped in the morning. 🤣
Or maybe that's scheduled for Tuesday morning?
Will have to check the log on the ol' Wayback Machine. 🤣

 
Last edited:
  • Like
  • Love
  • Haha
Reactions: 12 users

jtardif999

Regular
This video could be quite significant. I’m getting a little excited about the way the presenter describes this technology - it sounds very familiar..

I think back to a couple of years ago when Elon ditched radar and LiDAR to use just cameras, and maybe that was because, as displayed in the video, the NN can work directly with the millions of videos that had already been recorded on the open road (over some 10 years). To be able to be trained on just the videos, not out on the road itself, and then to apply what has been learned to the real world suggests an intelligence and REAL learning capability not available in ANY form of conventional computer architecture. BrainChip has already demonstrated this kind of intelligence with Akida: when training Akida on a picture of a tiger in the wild and then showing it a toy tiger standing on a table, Akida recognised the toy as the tiger in the picture. That is real and UNIQUE intelligence. AIMO.
I'm starting to think, like FJ, that perhaps this is not as it appears to be. Having watched the uncut version of the video, I'm leaning towards it being an LLM-style model created using semantic segmentation against all the video Tesla have accumulated, which sits on a supercomputer used to train all their vehicles. As previously stated, the references made to retraining on the supercomputer with selected video when a car performs badly do not indicate self-learning; rather, they describe a process of federated learning, which is not really learning at all, just very organised, efficient training. It could seem human-like in much the same way that ChatGPT responses seem human-like until you dig a little deeper, which perhaps explains the car's behaviour in the demo. There is a further summary video about the demo; I don't subscribe to its notion that training with just video and ordinary cameras will do away with LiDAR and radar. I think there will always be a need for these sensors. Tesla didn't demonstrate this tech under adverse conditions, or even at night, so the jury would be out on that. I also think that a hardware upgrade containing Akida would help the cause re power efficiency and local intelligence, and hell, even for one-shot learning! 🤓
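For anyone wondering what one-shot learning means concretely in the tiger example above, here's a toy, hypothetical sketch in plain NumPy (not the Akida/MetaTF API; BrainChip describes Akida's on-chip edge learning as working along broadly these lines in its final layer): learn a class from a single embedding by storing it as a prototype, then classify new inputs by similarity.

```python
import numpy as np

rng = np.random.default_rng(1)
emb_dim = 128

def unit(v):
    return v / np.linalg.norm(v)

# Hypothetical embeddings from some frozen feature extractor (illustrative only).
wild_tiger_photo = unit(rng.standard_normal(emb_dim))  # the single training shot
toy_tiger = unit(wild_tiger_photo + 0.4 * unit(rng.standard_normal(emb_dim)))
coffee_mug = unit(rng.standard_normal(emb_dim))

# One-shot "training": store the lone example as the class prototype.
prototypes = {"tiger": wild_tiger_photo}

def classify(x, threshold=0.5):
    """Nearest-prototype classification by cosine similarity."""
    label = max(prototypes, key=lambda c: float(prototypes[c] @ x))
    score = float(prototypes[label] @ x)
    return (label if score >= threshold else "unknown", round(score, 2))

print(classify(toy_tiger))   # expected: ('tiger', ~0.93)
print(classify(coffee_mug))  # expected: ('unknown', near 0.0)
```

Nothing here retrains the backbone; adding a class is just writing one vector, which is why this style of learning can run at the edge without a round trip to a data-centre GPU.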
 
  • Like
  • Fire
  • Love
Reactions: 16 users

Diogenese

Top 20
I'm starting to think, like FJ, that perhaps this is not as it appears to be. Having watched the uncut version of the video, I'm leaning towards it being an LLM-style model created using semantic segmentation against all the video Tesla have accumulated, which sits on a supercomputer used to train all their vehicles. As previously stated, the references made to retraining on the supercomputer with selected video when a car performs badly do not indicate self-learning; rather, they describe a process of federated learning, which is not really learning at all, just very organised, efficient training. It could seem human-like in much the same way that ChatGPT responses seem human-like until you dig a little deeper, which perhaps explains the car's behaviour in the demo. There is a further summary video about the demo; I don't subscribe to its notion that training with just video and ordinary cameras will do away with LiDAR and radar. I think there will always be a need for these sensors. Tesla didn't demonstrate this tech under adverse conditions, or even at night, so the jury would be out on that. I also think that a hardware upgrade containing Akida would help the cause re power efficiency and local intelligence, and hell, even for one-shot learning! 🤓
I agree.

If it does not recognize a paper bag or a ball, how is it going to know what to do?

A couple of points about the video:

1. It was done in full sunlight, no rain, fog, ...

2. How much power does it use?

As you say, it will be a large image database, so power consumption will be massive - the safest car in the world, but it only goes 10 km between charges.

If the database is contained in the vehicle for cloud-free operation, the weight of the electrons in the database will probably exceed the road weight limits.
 
Last edited:
  • Like
  • Haha
  • Fire
Reactions: 29 users