BRN Discussion Ongoing

7für7

Regular
  • Like
  • Haha
  • Fire
Reactions: 6 users

Luppo71

Founding Member
Over Xmas @Luppo71 and I held a secret Brainchip meeting and we were the only ones to show up. I was quite disappointed, as I was expecting Sean and TD to turn up.
Maybe a little too hush-hush.
We'll know for next time; we'll get Elon to promote it.
 
  • Haha
  • Like
Reactions: 10 users
It’s good to hope, but as I said, better not to count on Mercedes, otherwise you will be disappointed. I think realistic use of Akida in Benz cars will start around 2026. Until then, my focus will be on the other partners. Just my opinion! Mercedes will for now decide to use already proven technology. I don’t think they will risk image damage and/or a huge recall campaign.
You have missed my point.

It is not about when Mercedes will start to deliver vehicles with Akida inside; rather, it is when they come out and publicly confirm an ongoing partnership and integration (as they did at CES with Nvidia) that will provide the catalyst for the initial movement in the SP. They have already advised that later this year they will be deciding "on a partner" to further develop their AI technology.

Regarding using already proven technology, we will have to disagree here as well. I believe that Mercedes has been working to qualify our product and company over the last two years. Prior to their public announcement they would already have satisfied themselves to a large degree on the robustness, quality and reliability of our tech. It is my opinion that we are already well and truly "proven" in the eyes of Mercedes.

Of course this is all in my opinion and I may be wrong. Only time will provide the answer.
 
  • Like
  • Fire
  • Love
Reactions: 24 users
Is there something more mature you have to say? Or are you just a kid with cash from dad? Asking for a friend.
 
  • Haha
Reactions: 3 users
Actually I don’t think it needs that much training; in some use cases it may not need training at all - just a bit of configuration and an input stream to learn from.
This point you make about training is by itself a reason to watch this presentation, as the moderator's reaction to how quickly the presenter trained AKIDA using the Edge Impulse platform and MetaTF is white gold:

https://www.youtube.com/live/W9JTcTJ4eBU?si=To2EEjhqZmz9UpF9

(Sorry, I had put up the wrong link. Hopefully this time it works.)
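
For anyone curious what that Edge Impulse / MetaTF workflow roughly looks like in code, here is a minimal sketch of the train-then-convert flow: build and train an ordinary (non-spiking) Keras CNN, quantize it, then convert it for Akida with BrainChip's cnn2snn package. The exact function names and arguments (quantize, convert, forward) are as I recall them from the MetaTF docs and may differ between versions, so treat this as illustrative only, not a verified recipe.

# Illustrative sketch of the MetaTF flow: train a conventional (non-spiking)
# Keras network, quantize it, then convert it post-training for Akida.
# API names (cnn2snn.quantize / cnn2snn.convert) are from memory of the
# MetaTF docs and may vary by version.
import numpy as np
import tensorflow as tf
from cnn2snn import quantize, convert

# 1. Build and train a standard Keras CNN as usual.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(32, 32, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),  # logits; softmax applied off-chip
])
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
# model.fit(x_train, y_train, epochs=5)  # normal Keras training step

# 2. Quantize weights/activations to Akida-friendly bit widths.
model_q = quantize(model, input_weight_quantization=8,
                   weight_quantization=4, activ_quantization=4)

# 3. Convert the quantized network to an Akida model and run inference
#    (on the software simulator, or mapped to an Akida device on a Pi host).
model_akida = convert(model_q)
dummy = np.zeros((1, 32, 32, 1), dtype=np.uint8)
outputs = model_akida.forward(dummy)
print(outputs.shape)

The point about speed carries over: the heavy lifting is ordinary Keras training, and the Akida-specific part is just the quantize/convert step at the end.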

My opinion only DYOR
Fact Finder
 
Last edited:
  • Like
  • Love
Reactions: 13 users

7für7

Regular
You have missed my point.

It is not about when Mercedes will start to deliver vehicles with Akida inside; rather, it is when they come out and publicly confirm an ongoing partnership and integration (as they did at CES with Nvidia) that will provide the catalyst for the initial movement in the SP. They have already advised that later this year they will be deciding "on a partner" to further develop their AI technology.

Regarding using already proven technology, we will have to disagree here as well. I believe that Mercedes has been working to qualify our product and company over the last two years. Prior to their public announcement they would already have satisfied themselves to a large degree on the robustness, quality and reliability of our tech. It is my opinion that we are already well and truly "proven" in the eyes of Mercedes.

Of course this is all in my opinion and I may be wrong. Only time will provide the answer.
I hope your point of view is right. I just know how the Germans work and understand their way of talking about business matters. Every culture has its own way of expressing its development steps. I’m sure they see huge potential in Akida.

And as I said, I just don’t pay too much attention to Benz. That’s all…
 
  • Like
Reactions: 3 users

Moonshot

Regular
Hi Sera2g
Great summary. I noticed that to the right were the questions being asked by the audience, and I was particularly taken by this series of questions from Gregor Lenz, a prolific researcher who does a lot of work for NASA and Intel/Loihi. I have reproduced them below; his questions alone are gold. The answers are white gold, but you will have to listen:

Gregor Lenz: Was this deployed to Akida? In MetaTF, you train conventional, non-spiking networks, and then convert post-training, is that correct?

Justin R: On a RASPBERRY ??

Justin R: 😲😲

Open Neuromorphic: Thanks for joining everyone!! 👋 Please like 👍🏻👍🏽👍🏿! It will help the video reach more people!

Gregor Lenz: What's the input data like? Is it frames at fixed frequency or frames at fixed event count?

Gregor Lenz: You're saying 5W for Akida + Raspberry Pi, do you know how much power each board used?

bithigh: Do spiking neural networks work only with event-based cameras?

bithigh: Professor anxie gave awesome lectures in THI

Wojciech Rogala: 👋

Justin R: Hi @Wojciech Rogala

Gregor Lenz: If a synchronous edge chip that exploits sparsity like Akida burns just 5mW, is there a point of event-based, asynchronous systems?

bithigh: Do you think writing custom IP cores in HDL is worth the effort for implementing a custom SNN?

zanz: Use cases are huge!

I have highlighted the gold question, which is answered by the presenter, and the last response by 'zanz' is in line with the gentleman who appeared on the podcasts with Nandan from Infineon and Edge Impulse.

My opinion only DYOR
Fact Finder
Great summaries. Would also note Gregor is ex-SynSense…
 
  • Like
  • Love
  • Fire
Reactions: 14 users

DK6161

Regular
I’d put some charts up, but most don’t really seem to contribute to that area on here and things get lost in the wilderness due to inactivity..
Put the charts up mate. We are down 6%. Need to know where this is heading. My uneducated guess is down again for the foreseeable future. FFS
 
  • Like
Reactions: 1 user
This point you make about training is by itself a reason to watch this presentation, as the moderator's reaction to how quickly the presenter trained AKIDA using the Edge Impulse platform and MetaTF is white gold:



My opinion only DYOR
Fact Finder

He had me at “this guy’s absolutely amazing”
 
  • Sad
Reactions: 1 user

Bravo

If ARM was an arm, BRN would be its biceps💪!
Pretty sure this means Samsung are going to need an AKIDA Edge Box or two.



Samsung to build all-AI, no-human chip factories​

South Korean chip maker angling to apply smart sensors and remove human workers from the entire chip manufacturing process

By SCOTT FOSTER JANUARY 16, 2024


Samsung Electronics is planning to fully automate its semiconductor factories by 2030, with “smart sensors” set to control the manufacturing process, according to South Korean media reports. The world’s largest maker of memory chips aims to create an “artificial intelligence fab” that operates without human labor, and the ground-breaking project is reportedly already underway.

Samsung has signaled since last summer that it aims to use AI to optimize integrated circuit (IC) design, materials development, production, yield improvement and packaging. Identifying the cause of defects in the production process is reportedly a top priority of the AI plan.




 
  • Like
  • Fire
  • Love
Reactions: 42 users
Put the charts up mate. We are down 6%. Need to know where this is heading. My uneducated guess is down again for the foreseeable future. FFS
Finally a decent comment..

We know BRN usually builds a 9-month-plus base before any good sustained moves. So I’d be counting 9 months from early Oct and then looking for some more reasonable rounding-out price action.

If 14.5c can’t hold and price goes lower, then start again from any new lows that come in..

You can try buying an undercut and rally of established lows, such as price dropping and then rising back through 14.5c after more stops are shaken out.. But you still run the risk of the opportunity cost of a sideways pattern for 9-12 months again, as the base-building history of BRN will show..

So for me, from now, a best-case sustained rising share price for BRN has the historical potential to start from July onwards… Maybe this also matches anticipation of revenue incoming this half showing up in the full FY accounts..

No buy signal here.. IP announcements will create a spiking share price but not sustained moves.. Need a lengthy sustained base to clear out the sellers, and then a steady flow of good financial Anns showing increasing revenues or new paying customers..

It’s up to the company to start showing this progress, or it will keep suffering the consequences with a poor-performing share price.
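
A rough Python sketch of that undercut-and-rally idea, purely illustrative: it just flags a daily close under a support level (e.g. 14.5c) that is followed by a close back above the level within a few sessions. The level and the price series below are placeholders, not real BRN data and not a trading signal.

# Illustrative only: flag an "undercut and rally" of a support level.
# An undercut is a close below the level (the stop shake-out); it counts
# as a candidate if some close within the next few sessions recovers above it.

def undercut_and_rally(closes, level=0.145, max_gap=5):
    signals = []
    for i, close in enumerate(closes):
        if close < level:
            window = closes[i + 1 : i + 1 + max_gap]
            if any(later > level for later in window):
                signals.append(i)
    return signals

# Placeholder daily closes in dollars (not real data).
closes = [0.155, 0.150, 0.147, 0.143, 0.141, 0.146, 0.149, 0.152]
print(undercut_and_rally(closes))  # -> [3, 4]: the dips under 14.5c that recovered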
 
  • Like
Reactions: 7 users
Having said all that, DK, I did buy some today.. A small $20k amount in my SMSF.. If it halves, I’ve only lost a few % in account value..
 
Our ubiquitous, process-agnostic, AI-everywhere, 3-5-year-lead company is now -95% from its ATH while the entire sector is flying through the roof. Can we expect anything material to come this year, or will it be another year of great progress and fantastic partnerships?
 
  • Like
  • Haha
  • Fire
Reactions: 15 users
Our ubiquitous, process-agnostic, AI-everywhere, 3-5-year-lead company is now -95% from its ATH while the entire sector is flying through the roof. Can we expect anything material to come this year, or will it be another year of great progress and fantastic partnerships?
Are you taking the piss 🤔..
 
  • Haha
Reactions: 4 users
Our ubiquitous, process-agnostic, AI-everywhere, 3-5-year-lead company is now -95% from its ATH while the entire sector is flying through the roof. Can we expect anything material to come this year, or will it be another year of great progress and fantastic partnerships?
That’s about a perfect summation.. Balanced and accurate apart from the objectification of animals
 
  • Like
  • Haha
Reactions: 8 users

Iseki

Regular
Hi Sera2g
Great summary. I noticed that to the right were the questions being asked by the audience, and I was particularly taken by this series of questions from Gregor Lenz, a prolific researcher who does a lot of work for NASA and Intel/Loihi. […]

My opinion only DYOR
Fact Finder
We need to get this Cristian on the payroll so we can have him all to ourselves.

I listened to the end, which became very mathematical, but it gave the following: SNNs are special because they can model not just what you are looking for, but can also include a model that keeps the system "antifragile". This means that as they learn more, they won't suddenly fall over and give false results.
 
  • Like
  • Fire
  • Love
Reactions: 15 users